Jan 24 00:44:39.834776 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Fri Jan 23 21:38:55 -00 2026 Jan 24 00:44:39.834795 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ccc6714d5701627f00a0daea097f593263f2ea87c850869ae25db66d36e22877 Jan 24 00:44:39.834802 kernel: BIOS-provided physical RAM map: Jan 24 00:44:39.834807 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 24 00:44:39.834814 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable Jan 24 00:44:39.834819 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved Jan 24 00:44:39.834825 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable Jan 24 00:44:39.834830 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 24 00:44:39.834835 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 24 00:44:39.834840 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 24 00:44:39.834845 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable Jan 24 00:44:39.834850 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved Jan 24 00:44:39.834857 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 24 00:44:39.834862 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 24 00:44:39.834867 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 24 00:44:39.834873 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable Jan 24 00:44:39.834878 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Jan 24 00:44:39.834885 kernel: NX (Execute Disable) protection: active Jan 24 00:44:39.834890 kernel: APIC: Static calls initialized Jan 24 00:44:39.834898 kernel: e820: update [mem 0x7dfad018-0x7dfb6a57] usable ==> usable Jan 24 00:44:39.834907 kernel: e820: update [mem 0x7df71018-0x7dfac657] usable ==> usable Jan 24 00:44:39.834914 kernel: e820: update [mem 0x7df35018-0x7df70657] usable ==> usable Jan 24 00:44:39.834922 kernel: extended physical RAM map: Jan 24 00:44:39.834930 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 24 00:44:39.834935 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007df35017] usable Jan 24 00:44:39.834940 kernel: reserve setup_data: [mem 0x000000007df35018-0x000000007df70657] usable Jan 24 00:44:39.834945 kernel: reserve setup_data: [mem 0x000000007df70658-0x000000007df71017] usable Jan 24 00:44:39.834953 kernel: reserve setup_data: [mem 0x000000007df71018-0x000000007dfac657] usable Jan 24 00:44:39.834958 kernel: reserve setup_data: [mem 0x000000007dfac658-0x000000007dfad017] usable Jan 24 00:44:39.834963 kernel: reserve setup_data: [mem 0x000000007dfad018-0x000000007dfb6a57] usable Jan 24 00:44:39.834968 kernel: reserve setup_data: [mem 0x000000007dfb6a58-0x000000007ed3efff] usable Jan 24 00:44:39.834973 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved Jan 24 00:44:39.834990 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable Jan 24 00:44:39.834996 kernel: 
reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 24 00:44:39.835001 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 24 00:44:39.835011 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 24 00:44:39.835017 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable Jan 24 00:44:39.835022 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved Jan 24 00:44:39.835029 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 24 00:44:39.835034 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 24 00:44:39.835043 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 24 00:44:39.835048 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable Jan 24 00:44:39.835055 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Jan 24 00:44:39.835061 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Jan 24 00:44:39.835067 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e845198 RNG=0x7fb73018 Jan 24 00:44:39.835072 kernel: random: crng init done Jan 24 00:44:39.835078 kernel: efi: Remove mem136: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Jan 24 00:44:39.835084 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Jan 24 00:44:39.835089 kernel: secureboot: Secure boot disabled Jan 24 00:44:39.835094 kernel: SMBIOS 3.0.0 present. Jan 24 00:44:39.835100 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017 Jan 24 00:44:39.835107 kernel: DMI: Memory slots populated: 1/1 Jan 24 00:44:39.835113 kernel: Hypervisor detected: KVM Jan 24 00:44:39.835122 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000 Jan 24 00:44:39.835128 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jan 24 00:44:39.835133 kernel: kvm-clock: using sched offset of 13115068922 cycles Jan 24 00:44:39.835139 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jan 24 00:44:39.835145 kernel: tsc: Detected 2399.998 MHz processor Jan 24 00:44:39.835151 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 24 00:44:39.835157 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 24 00:44:39.835163 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000 Jan 24 00:44:39.835171 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Jan 24 00:44:39.835177 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 24 00:44:39.835182 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000 Jan 24 00:44:39.835188 kernel: Using GB pages for direct mapping Jan 24 00:44:39.835194 kernel: ACPI: Early table checksum verification disabled Jan 24 00:44:39.835199 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS ) Jan 24 00:44:39.835205 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013) Jan 24 00:44:39.835213 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 24 00:44:39.835219 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 24 00:44:39.835225 kernel: ACPI: FACS 0x000000007FBDD000 000040 Jan 24 00:44:39.835230 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 24 00:44:39.835236 kernel: ACPI: HPET 
0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 24 00:44:39.835242 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 24 00:44:39.835248 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 24 00:44:39.835256 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013) Jan 24 00:44:39.835261 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3] Jan 24 00:44:39.835267 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442] Jan 24 00:44:39.835273 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f] Jan 24 00:44:39.835279 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f] Jan 24 00:44:39.835284 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037] Jan 24 00:44:39.835290 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b] Jan 24 00:44:39.835298 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027] Jan 24 00:44:39.835303 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037] Jan 24 00:44:39.835309 kernel: No NUMA configuration found Jan 24 00:44:39.835315 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff] Jan 24 00:44:39.835321 kernel: NODE_DATA(0) allocated [mem 0x179ff6dc0-0x179ffdfff] Jan 24 00:44:39.835326 kernel: Zone ranges: Jan 24 00:44:39.835332 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 24 00:44:39.835337 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 24 00:44:39.835345 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff] Jan 24 00:44:39.835351 kernel: Device empty Jan 24 00:44:39.835357 kernel: Movable zone start for each node Jan 24 00:44:39.835362 kernel: Early memory node ranges Jan 24 00:44:39.835368 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 24 00:44:39.835373 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff] Jan 24 00:44:39.835379 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff] Jan 24 00:44:39.835385 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff] Jan 24 00:44:39.835393 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff] Jan 24 00:44:39.835398 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff] Jan 24 00:44:39.835404 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 24 00:44:39.835410 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 24 00:44:39.835415 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Jan 24 00:44:39.835432 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Jan 24 00:44:39.835438 kernel: On node 0, zone Normal: 132 pages in unavailable ranges Jan 24 00:44:39.835457 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Jan 24 00:44:39.835462 kernel: ACPI: PM-Timer IO Port: 0x608 Jan 24 00:44:39.835468 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jan 24 00:44:39.835474 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 24 00:44:39.835479 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jan 24 00:44:39.835492 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jan 24 00:44:39.835498 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 24 00:44:39.835506 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jan 24 00:44:39.835511 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) 
Jan 24 00:44:39.835517 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 24 00:44:39.835523 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Jan 24 00:44:39.835529 kernel: CPU topo: Max. logical packages: 1 Jan 24 00:44:39.835535 kernel: CPU topo: Max. logical dies: 1 Jan 24 00:44:39.835548 kernel: CPU topo: Max. dies per package: 1 Jan 24 00:44:39.835554 kernel: CPU topo: Max. threads per core: 1 Jan 24 00:44:39.835560 kernel: CPU topo: Num. cores per package: 2 Jan 24 00:44:39.835566 kernel: CPU topo: Num. threads per package: 2 Jan 24 00:44:39.835574 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 24 00:44:39.835579 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jan 24 00:44:39.835585 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Jan 24 00:44:39.835591 kernel: Booting paravirtualized kernel on KVM Jan 24 00:44:39.835597 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 24 00:44:39.835606 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 24 00:44:39.835612 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 24 00:44:39.835618 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 24 00:44:39.835624 kernel: pcpu-alloc: [0] 0 1 Jan 24 00:44:39.835629 kernel: kvm-guest: PV spinlocks disabled, no host support Jan 24 00:44:39.835636 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ccc6714d5701627f00a0daea097f593263f2ea87c850869ae25db66d36e22877 Jan 24 00:44:39.835645 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 24 00:44:39.835651 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 24 00:44:39.835657 kernel: Fallback order for Node 0: 0 Jan 24 00:44:39.835663 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792 Jan 24 00:44:39.835669 kernel: Policy zone: Normal Jan 24 00:44:39.835675 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 24 00:44:39.835681 kernel: software IO TLB: area num 2. Jan 24 00:44:39.835689 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 24 00:44:39.835695 kernel: ftrace: allocating 40128 entries in 157 pages Jan 24 00:44:39.835701 kernel: ftrace: allocated 157 pages with 5 groups Jan 24 00:44:39.835706 kernel: Dynamic Preempt: voluntary Jan 24 00:44:39.835712 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 24 00:44:39.835719 kernel: rcu: RCU event tracing is enabled. Jan 24 00:44:39.835725 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 24 00:44:39.835731 kernel: Trampoline variant of Tasks RCU enabled. Jan 24 00:44:39.835739 kernel: Rude variant of Tasks RCU enabled. Jan 24 00:44:39.835745 kernel: Tracing variant of Tasks RCU enabled. Jan 24 00:44:39.835751 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 24 00:44:39.835757 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 24 00:44:39.835764 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jan 24 00:44:39.835770 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 24 00:44:39.835776 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 24 00:44:39.835784 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jan 24 00:44:39.835790 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 24 00:44:39.835796 kernel: Console: colour dummy device 80x25 Jan 24 00:44:39.835802 kernel: printk: legacy console [tty0] enabled Jan 24 00:44:39.835808 kernel: printk: legacy console [ttyS0] enabled Jan 24 00:44:39.835814 kernel: ACPI: Core revision 20240827 Jan 24 00:44:39.835820 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Jan 24 00:44:39.835828 kernel: APIC: Switch to symmetric I/O mode setup Jan 24 00:44:39.835834 kernel: x2apic enabled Jan 24 00:44:39.835840 kernel: APIC: Switched APIC routing to: physical x2apic Jan 24 00:44:39.835846 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 24 00:44:39.835852 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jan 24 00:44:39.835858 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998) Jan 24 00:44:39.835864 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 24 00:44:39.835870 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Jan 24 00:44:39.835878 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Jan 24 00:44:39.835884 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 24 00:44:39.835890 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 24 00:44:39.835896 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 24 00:44:39.835902 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 24 00:44:39.835908 kernel: active return thunk: srso_alias_return_thunk Jan 24 00:44:39.835916 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET Jan 24 00:44:39.835922 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM Jan 24 00:44:39.835928 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode Jan 24 00:44:39.835934 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 24 00:44:39.835940 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 24 00:44:39.835946 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 24 00:44:39.835952 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 24 00:44:39.835960 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 24 00:44:39.835966 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 24 00:44:39.835972 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Jan 24 00:44:39.835978 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 24 00:44:39.835984 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 24 00:44:39.835990 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 24 00:44:39.835996 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 24 00:44:39.836002 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Jan 24 00:44:39.836010 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 
'compacted' format. Jan 24 00:44:39.836016 kernel: Freeing SMP alternatives memory: 32K Jan 24 00:44:39.836023 kernel: pid_max: default: 32768 minimum: 301 Jan 24 00:44:39.836029 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 24 00:44:39.836035 kernel: landlock: Up and running. Jan 24 00:44:39.836041 kernel: SELinux: Initializing. Jan 24 00:44:39.836047 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 24 00:44:39.836055 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 24 00:44:39.836061 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0) Jan 24 00:44:39.836067 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Jan 24 00:44:39.836072 kernel: ... version: 0 Jan 24 00:44:39.836078 kernel: ... bit width: 48 Jan 24 00:44:39.836084 kernel: ... generic registers: 6 Jan 24 00:44:39.836090 kernel: ... value mask: 0000ffffffffffff Jan 24 00:44:39.836098 kernel: ... max period: 00007fffffffffff Jan 24 00:44:39.836104 kernel: ... fixed-purpose events: 0 Jan 24 00:44:39.836110 kernel: ... event mask: 000000000000003f Jan 24 00:44:39.836116 kernel: signal: max sigframe size: 3376 Jan 24 00:44:39.836122 kernel: rcu: Hierarchical SRCU implementation. Jan 24 00:44:39.836128 kernel: rcu: Max phase no-delay instances is 400. Jan 24 00:44:39.836134 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 24 00:44:39.836142 kernel: smp: Bringing up secondary CPUs ... Jan 24 00:44:39.836147 kernel: smpboot: x86: Booting SMP configuration: Jan 24 00:44:39.836153 kernel: .... node #0, CPUs: #1 Jan 24 00:44:39.836159 kernel: smp: Brought up 1 node, 2 CPUs Jan 24 00:44:39.836165 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS) Jan 24 00:44:39.836171 kernel: Memory: 3873088K/4091168K available (14336K kernel code, 2445K rwdata, 31644K rodata, 15540K init, 2496K bss, 212448K reserved, 0K cma-reserved) Jan 24 00:44:39.836177 kernel: devtmpfs: initialized Jan 24 00:44:39.836185 kernel: x86/mm: Memory block size: 128MB Jan 24 00:44:39.836191 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes) Jan 24 00:44:39.836197 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 24 00:44:39.836204 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 24 00:44:39.836210 kernel: pinctrl core: initialized pinctrl subsystem Jan 24 00:44:39.836216 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 24 00:44:39.836221 kernel: audit: initializing netlink subsys (disabled) Jan 24 00:44:39.836229 kernel: audit: type=2000 audit(1769215474.797:1): state=initialized audit_enabled=0 res=1 Jan 24 00:44:39.836235 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 24 00:44:39.836242 kernel: thermal_sys: Registered thermal governor 'user_space' Jan 24 00:44:39.836247 kernel: cpuidle: using governor menu Jan 24 00:44:39.836253 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 24 00:44:39.836259 kernel: dca service started, version 1.12.1 Jan 24 00:44:39.836265 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Jan 24 00:44:39.836273 kernel: PCI: Using configuration type 1 for base access Jan 24 00:44:39.836279 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Jan 24 00:44:39.836285 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 24 00:44:39.836291 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 24 00:44:39.836297 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 24 00:44:39.836303 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 24 00:44:39.836309 kernel: ACPI: Added _OSI(Module Device) Jan 24 00:44:39.836317 kernel: ACPI: Added _OSI(Processor Device) Jan 24 00:44:39.836323 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 24 00:44:39.836329 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 24 00:44:39.836335 kernel: ACPI: Interpreter enabled Jan 24 00:44:39.836341 kernel: ACPI: PM: (supports S0 S5) Jan 24 00:44:39.836347 kernel: ACPI: Using IOAPIC for interrupt routing Jan 24 00:44:39.836353 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 24 00:44:39.836359 kernel: PCI: Using E820 reservations for host bridge windows Jan 24 00:44:39.836368 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jan 24 00:44:39.836373 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 24 00:44:39.836622 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 24 00:44:39.836776 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jan 24 00:44:39.836957 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jan 24 00:44:39.836969 kernel: PCI host bridge to bus 0000:00 Jan 24 00:44:39.837116 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 24 00:44:39.837250 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jan 24 00:44:39.837382 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 24 00:44:39.837548 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Jan 24 00:44:39.838046 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Jan 24 00:44:39.838188 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window] Jan 24 00:44:39.839787 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 24 00:44:39.840965 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jan 24 00:44:39.841136 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jan 24 00:44:39.843593 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Jan 24 00:44:39.843753 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref] Jan 24 00:44:39.843904 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff] Jan 24 00:44:39.844049 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Jan 24 00:44:39.844191 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 24 00:44:39.844341 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.844504 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff] Jan 24 00:44:39.844649 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 24 00:44:39.844790 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Jan 24 00:44:39.844932 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 24 00:44:39.845082 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.845224 kernel: pci 0000:00:02.1: BAR 0 [mem 
0x81388000-0x81388fff] Jan 24 00:44:39.845367 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 24 00:44:39.846958 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Jan 24 00:44:39.847113 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.847256 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff] Jan 24 00:44:39.847398 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 24 00:44:39.847568 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Jan 24 00:44:39.847714 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 24 00:44:39.847862 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.848020 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff] Jan 24 00:44:39.848174 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 24 00:44:39.848315 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 24 00:44:39.848495 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.848644 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff] Jan 24 00:44:39.848783 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 24 00:44:39.848923 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Jan 24 00:44:39.849063 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 24 00:44:39.849210 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.849352 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff] Jan 24 00:44:39.849517 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 24 00:44:39.849657 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Jan 24 00:44:39.849796 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 24 00:44:39.849941 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.850081 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff] Jan 24 00:44:39.850220 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 24 00:44:39.850362 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Jan 24 00:44:39.851889 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 24 00:44:39.852099 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.852296 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff] Jan 24 00:44:39.852466 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 24 00:44:39.852630 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Jan 24 00:44:39.852772 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Jan 24 00:44:39.853049 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 24 00:44:39.853194 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff] Jan 24 00:44:39.853335 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 24 00:44:39.854522 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Jan 24 00:44:39.854682 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 24 00:44:39.854831 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 24 00:44:39.854981 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 24 00:44:39.855130 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 
conventional PCI endpoint Jan 24 00:44:39.855271 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f] Jan 24 00:44:39.855413 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff] Jan 24 00:44:39.855585 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 24 00:44:39.855727 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f] Jan 24 00:44:39.855880 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 24 00:44:39.856025 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff] Jan 24 00:44:39.856169 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref] Jan 24 00:44:39.856316 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 24 00:44:39.857899 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 24 00:44:39.858069 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 24 00:44:39.858229 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit] Jan 24 00:44:39.858372 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 24 00:44:39.858555 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Jan 24 00:44:39.858708 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff] Jan 24 00:44:39.858853 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref] Jan 24 00:44:39.859004 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 24 00:44:39.859157 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 24 00:44:39.859303 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref] Jan 24 00:44:39.861465 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 24 00:44:39.861646 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 24 00:44:39.861795 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff] Jan 24 00:44:39.861941 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref] Jan 24 00:44:39.862082 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 24 00:44:39.862233 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Jan 24 00:44:39.862383 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff] Jan 24 00:44:39.862555 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref] Jan 24 00:44:39.862697 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 24 00:44:39.862705 kernel: acpiphp: Slot [0] registered Jan 24 00:44:39.862855 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 24 00:44:39.863009 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff] Jan 24 00:44:39.863158 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref] Jan 24 00:44:39.863302 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 24 00:44:39.863494 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 24 00:44:39.863504 kernel: acpiphp: Slot [0-2] registered Jan 24 00:44:39.863646 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 24 00:44:39.863654 kernel: acpiphp: Slot [0-3] registered Jan 24 00:44:39.863796 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 24 00:44:39.863806 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 24 00:44:39.863823 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 24 00:44:39.863832 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 24 00:44:39.863838 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 24 00:44:39.863844 kernel: ACPI: PCI: Interrupt link LNKE 
configured for IRQ 10 Jan 24 00:44:39.863851 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 24 00:44:39.863859 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 24 00:44:39.863865 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 24 00:44:39.863871 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 24 00:44:39.863878 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Jan 24 00:44:39.863884 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 24 00:44:39.863890 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 24 00:44:39.863896 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 24 00:44:39.863905 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 24 00:44:39.863911 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 24 00:44:39.863919 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 24 00:44:39.863926 kernel: iommu: Default domain type: Translated Jan 24 00:44:39.863934 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 24 00:44:39.863940 kernel: efivars: Registered efivars operations Jan 24 00:44:39.863946 kernel: PCI: Using ACPI for IRQ routing Jan 24 00:44:39.863953 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 24 00:44:39.863960 kernel: e820: reserve RAM buffer [mem 0x7df35018-0x7fffffff] Jan 24 00:44:39.863966 kernel: e820: reserve RAM buffer [mem 0x7df71018-0x7fffffff] Jan 24 00:44:39.863972 kernel: e820: reserve RAM buffer [mem 0x7dfad018-0x7fffffff] Jan 24 00:44:39.863978 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff] Jan 24 00:44:39.863987 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 24 00:44:39.863993 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff] Jan 24 00:44:39.863999 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff] Jan 24 00:44:39.864141 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 24 00:44:39.864280 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 24 00:44:39.864438 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 24 00:44:39.864449 kernel: vgaarb: loaded Jan 24 00:44:39.864455 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Jan 24 00:44:39.864462 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Jan 24 00:44:39.864468 kernel: clocksource: Switched to clocksource kvm-clock Jan 24 00:44:39.864474 kernel: VFS: Disk quotas dquot_6.6.0 Jan 24 00:44:39.864481 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 24 00:44:39.864494 kernel: pnp: PnP ACPI init Jan 24 00:44:39.864656 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 24 00:44:39.864665 kernel: pnp: PnP ACPI: found 5 devices Jan 24 00:44:39.864672 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 24 00:44:39.864678 kernel: NET: Registered PF_INET protocol family Jan 24 00:44:39.864684 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 24 00:44:39.864691 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 24 00:44:39.864698 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 24 00:44:39.864707 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 24 00:44:39.864713 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 24 00:44:39.864719 
kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 24 00:44:39.864726 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 24 00:44:39.864732 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 24 00:44:39.864738 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 24 00:44:39.864745 kernel: NET: Registered PF_XDP protocol family Jan 24 00:44:39.866080 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 24 00:44:39.866235 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 24 00:44:39.866380 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 24 00:44:39.866555 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 24 00:44:39.866698 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 24 00:44:39.866840 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Jan 24 00:44:39.866996 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned Jan 24 00:44:39.867137 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Jan 24 00:44:39.867284 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned Jan 24 00:44:39.867438 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 24 00:44:39.867591 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Jan 24 00:44:39.867732 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 24 00:44:39.867873 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 24 00:44:39.868015 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Jan 24 00:44:39.868155 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 24 00:44:39.868294 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Jan 24 00:44:39.868456 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 24 00:44:39.868606 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 24 00:44:39.868746 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 24 00:44:39.868886 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 24 00:44:39.869030 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Jan 24 00:44:39.869170 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 24 00:44:39.869311 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 24 00:44:39.869465 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Jan 24 00:44:39.869614 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 24 00:44:39.869758 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Jan 24 00:44:39.869899 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 24 00:44:39.870042 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Jan 24 00:44:39.870181 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Jan 24 00:44:39.870321 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 24 00:44:39.870490 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 24 00:44:39.870634 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Jan 24 00:44:39.870773 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Jan 24 00:44:39.870920 kernel: pci 0000:00:02.7: bridge window [mem 
0xc020000000-0xc03fffffff 64bit pref] Jan 24 00:44:39.871072 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 24 00:44:39.871213 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Jan 24 00:44:39.871352 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Jan 24 00:44:39.871512 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 24 00:44:39.871649 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 24 00:44:39.871785 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 24 00:44:39.871916 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 24 00:44:39.872048 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 24 00:44:39.872179 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 24 00:44:39.872334 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Jan 24 00:44:39.872510 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Jan 24 00:44:39.872655 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 24 00:44:39.872800 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Jan 24 00:44:39.872946 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Jan 24 00:44:39.873113 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 24 00:44:39.873258 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 24 00:44:39.873408 kernel: pci_bus 0000:05: resource 1 [mem 0x80f00000-0x80ffffff] Jan 24 00:44:39.873573 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 24 00:44:39.873741 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Jan 24 00:44:39.873880 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 24 00:44:39.874023 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Jan 24 00:44:39.874212 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Jan 24 00:44:39.874364 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 24 00:44:39.874531 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Jan 24 00:44:39.874668 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Jan 24 00:44:39.874803 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Jan 24 00:44:39.874953 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Jan 24 00:44:39.875095 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Jan 24 00:44:39.875231 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 24 00:44:39.875240 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 24 00:44:39.875247 kernel: PCI: CLS 0 bytes, default 64 Jan 24 00:44:39.875253 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 24 00:44:39.875260 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Jan 24 00:44:39.875269 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jan 24 00:44:39.875275 kernel: Initialise system trusted keyrings Jan 24 00:44:39.875281 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 24 00:44:39.875288 kernel: Key type asymmetric registered Jan 24 00:44:39.875294 kernel: Asymmetric key parser 'x509' registered Jan 24 00:44:39.875300 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 24 00:44:39.875307 
kernel: io scheduler mq-deadline registered Jan 24 00:44:39.875315 kernel: io scheduler kyber registered Jan 24 00:44:39.875321 kernel: io scheduler bfq registered Jan 24 00:44:39.875480 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 24 00:44:39.875647 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 24 00:44:39.875791 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 24 00:44:39.875931 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 24 00:44:39.876072 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 24 00:44:39.876216 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 24 00:44:39.876356 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 24 00:44:39.876519 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 24 00:44:39.876663 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 24 00:44:39.876804 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 24 00:44:39.876944 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 24 00:44:39.877084 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 24 00:44:39.877227 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 24 00:44:39.877366 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 24 00:44:39.877526 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 24 00:44:39.877667 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 24 00:44:39.877676 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 24 00:44:39.877818 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 24 00:44:39.877957 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 24 00:44:39.877965 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 24 00:44:39.877971 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 24 00:44:39.877978 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 24 00:44:39.877984 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 24 00:44:39.877993 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 24 00:44:39.877999 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 24 00:44:39.878006 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 24 00:44:39.878151 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 24 00:44:39.878160 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 24 00:44:39.878294 kernel: rtc_cmos 00:03: registered as rtc0 Jan 24 00:44:39.878443 kernel: rtc_cmos 00:03: setting system clock to 2026-01-24T00:44:37 UTC (1769215477) Jan 24 00:44:39.878589 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Jan 24 00:44:39.878598 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. 
Jan 24 00:44:39.878605 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Jan 24 00:44:39.878611 kernel: efifb: probing for efifb Jan 24 00:44:39.878617 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 24 00:44:39.878624 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 24 00:44:39.878633 kernel: efifb: scrolling: redraw Jan 24 00:44:39.878640 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 24 00:44:39.878646 kernel: Console: switching to colour frame buffer device 160x50 Jan 24 00:44:39.878653 kernel: fb0: EFI VGA frame buffer device Jan 24 00:44:39.878659 kernel: pstore: Using crash dump compression: deflate Jan 24 00:44:39.878666 kernel: pstore: Registered efi_pstore as persistent store backend Jan 24 00:44:39.878691 kernel: NET: Registered PF_INET6 protocol family Jan 24 00:44:39.878723 kernel: Segment Routing with IPv6 Jan 24 00:44:39.878729 kernel: In-situ OAM (IOAM) with IPv6 Jan 24 00:44:39.878735 kernel: NET: Registered PF_PACKET protocol family Jan 24 00:44:39.878741 kernel: Key type dns_resolver registered Jan 24 00:44:39.878748 kernel: IPI shorthand broadcast: enabled Jan 24 00:44:39.878754 kernel: sched_clock: Marking stable (2227011186, 236916391)->(2494370448, -30442871) Jan 24 00:44:39.878761 kernel: registered taskstats version 1 Jan 24 00:44:39.878769 kernel: Loading compiled-in X.509 certificates Jan 24 00:44:39.878776 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 08600fac738f210e3b32f727339edfe2b1af2e3d' Jan 24 00:44:39.878782 kernel: Demotion targets for Node 0: null Jan 24 00:44:39.878788 kernel: Key type .fscrypt registered Jan 24 00:44:39.878794 kernel: Key type fscrypt-provisioning registered Jan 24 00:44:39.878801 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 24 00:44:39.878808 kernel: ima: Allocated hash algorithm: sha1 Jan 24 00:44:39.878816 kernel: ima: No architecture policies found Jan 24 00:44:39.878822 kernel: clk: Disabling unused clocks Jan 24 00:44:39.878829 kernel: Freeing unused kernel image (initmem) memory: 15540K Jan 24 00:44:39.878837 kernel: Write protecting the kernel read-only data: 47104k Jan 24 00:44:39.878866 kernel: Freeing unused kernel image (rodata/data gap) memory: 1124K Jan 24 00:44:39.878872 kernel: Run /init as init process Jan 24 00:44:39.878879 kernel: with arguments: Jan 24 00:44:39.878888 kernel: /init Jan 24 00:44:39.878896 kernel: with environment: Jan 24 00:44:39.878906 kernel: HOME=/ Jan 24 00:44:39.878915 kernel: TERM=linux Jan 24 00:44:39.878925 kernel: ACPI: bus type USB registered Jan 24 00:44:39.878933 kernel: usbcore: registered new interface driver usbfs Jan 24 00:44:39.878939 kernel: usbcore: registered new interface driver hub Jan 24 00:44:39.878945 kernel: usbcore: registered new device driver usb Jan 24 00:44:39.879107 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 24 00:44:39.879254 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 24 00:44:39.879400 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 24 00:44:39.879570 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 24 00:44:39.879716 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 24 00:44:39.879861 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 24 00:44:39.880040 kernel: hub 1-0:1.0: USB hub found Jan 24 00:44:39.880199 kernel: hub 1-0:1.0: 4 ports detected Jan 24 00:44:39.880385 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 24 00:44:39.880582 kernel: hub 2-0:1.0: USB hub found Jan 24 00:44:39.880739 kernel: hub 2-0:1.0: 4 ports detected Jan 24 00:44:39.880750 kernel: SCSI subsystem initialized Jan 24 00:44:39.880756 kernel: libata version 3.00 loaded. 
Jan 24 00:44:39.880898 kernel: ahci 0000:00:1f.2: version 3.0 Jan 24 00:44:39.880907 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 24 00:44:39.881046 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 24 00:44:39.881186 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 24 00:44:39.881327 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 24 00:44:39.881521 kernel: scsi host0: ahci Jan 24 00:44:39.881726 kernel: scsi host1: ahci Jan 24 00:44:39.881880 kernel: scsi host2: ahci Jan 24 00:44:39.882032 kernel: scsi host3: ahci Jan 24 00:44:39.882182 kernel: scsi host4: ahci Jan 24 00:44:39.882336 kernel: scsi host5: ahci Jan 24 00:44:39.882344 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 38 lpm-pol 1 Jan 24 00:44:39.882351 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 38 lpm-pol 1 Jan 24 00:44:39.882357 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 38 lpm-pol 1 Jan 24 00:44:39.882364 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 38 lpm-pol 1 Jan 24 00:44:39.882370 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 38 lpm-pol 1 Jan 24 00:44:39.882379 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 38 lpm-pol 1 Jan 24 00:44:39.882583 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 24 00:44:39.882594 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 24 00:44:39.882600 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Jan 24 00:44:39.882607 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 24 00:44:39.882613 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 24 00:44:39.882619 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 24 00:44:39.882629 kernel: ata1.00: LPM support broken, forcing max_power Jan 24 00:44:39.882636 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Jan 24 00:44:39.882642 kernel: ata1.00: applying bridge limits Jan 24 00:44:39.882648 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 24 00:44:39.882655 kernel: ata1.00: LPM support broken, forcing max_power Jan 24 00:44:39.882661 kernel: ata1.00: configured for UDMA/100 Jan 24 00:44:39.882667 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 24 00:44:39.882837 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Jan 24 00:44:39.882845 kernel: usbcore: registered new interface driver usbhid Jan 24 00:44:39.882851 kernel: usbhid: USB HID core driver Jan 24 00:44:39.883018 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Jan 24 00:44:39.883027 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 24 00:44:39.883182 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jan 24 00:44:39.883345 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jan 24 00:44:39.883354 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 24 00:44:39.883527 kernel: scsi host6: Virtio SCSI HBA Jan 24 00:44:39.883708 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 24 00:44:39.883880 kernel: scsi 6:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jan 24 00:44:39.884044 kernel: sd 6:0:0:0: Power-on or device reset occurred Jan 24 00:44:39.884204 kernel: sd 6:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Jan 24 
00:44:39.884363 kernel: sd 6:0:0:0: [sda] Write Protect is off Jan 24 00:44:39.884553 kernel: sd 6:0:0:0: [sda] Mode Sense: 63 00 00 08 Jan 24 00:44:39.884713 kernel: sd 6:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jan 24 00:44:39.884721 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 24 00:44:39.884731 kernel: GPT:25804799 != 160006143 Jan 24 00:44:39.884737 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 24 00:44:39.884744 kernel: GPT:25804799 != 160006143 Jan 24 00:44:39.884750 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 24 00:44:39.884756 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 24 00:44:39.884916 kernel: sd 6:0:0:0: [sda] Attached SCSI disk Jan 24 00:44:39.884924 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 24 00:44:39.884932 kernel: device-mapper: uevent: version 1.0.3 Jan 24 00:44:39.884939 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 24 00:44:39.884945 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 24 00:44:39.884952 kernel: raid6: avx512x4 gen() 19473 MB/s Jan 24 00:44:39.884958 kernel: raid6: avx512x2 gen() 20887 MB/s Jan 24 00:44:39.884964 kernel: raid6: avx512x1 gen() 23988 MB/s Jan 24 00:44:39.884971 kernel: raid6: avx2x4 gen() 41358 MB/s Jan 24 00:44:39.884979 kernel: raid6: avx2x2 gen() 57216 MB/s Jan 24 00:44:39.884985 kernel: raid6: avx2x1 gen() 47519 MB/s Jan 24 00:44:39.884992 kernel: raid6: using algorithm avx2x2 gen() 57216 MB/s Jan 24 00:44:39.884998 kernel: raid6: .... xor() 36769 MB/s, rmw enabled Jan 24 00:44:39.885004 kernel: raid6: using avx512x2 recovery algorithm Jan 24 00:44:39.885010 kernel: xor: automatically using best checksumming function avx Jan 24 00:44:39.885017 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 24 00:44:39.885023 kernel: BTRFS: device fsid 091bfa4a-922a-4e6e-abc1-a4b74083975f devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (185) Jan 24 00:44:39.885032 kernel: BTRFS info (device dm-0): first mount of filesystem 091bfa4a-922a-4e6e-abc1-a4b74083975f Jan 24 00:44:39.885038 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 24 00:44:39.885045 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 24 00:44:39.885051 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 24 00:44:39.885058 kernel: BTRFS info (device dm-0): enabling free space tree Jan 24 00:44:39.885064 kernel: loop: module loaded Jan 24 00:44:39.885070 kernel: loop0: detected capacity change from 0 to 100560 Jan 24 00:44:39.885078 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 24 00:44:39.885086 systemd[1]: Successfully made /usr/ read-only. Jan 24 00:44:39.885094 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 24 00:44:39.885101 systemd[1]: Detected virtualization kvm. Jan 24 00:44:39.885108 systemd[1]: Detected architecture x86-64. Jan 24 00:44:39.885116 systemd[1]: Running in initrd. Jan 24 00:44:39.885123 systemd[1]: No hostname configured, using default hostname. 
Jan 24 00:44:39.885130 systemd[1]: Hostname set to <localhost>. Jan 24 00:44:39.885136 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 24 00:44:39.885143 systemd[1]: Queued start job for default target initrd.target. Jan 24 00:44:39.885149 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 24 00:44:39.885156 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 24 00:44:39.885165 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 24 00:44:39.885172 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 24 00:44:39.885179 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 24 00:44:39.885186 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 24 00:44:39.885193 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 24 00:44:39.885201 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 24 00:44:39.885208 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 24 00:44:39.885215 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 24 00:44:39.885222 systemd[1]: Reached target paths.target - Path Units. Jan 24 00:44:39.885228 systemd[1]: Reached target slices.target - Slice Units. Jan 24 00:44:39.885235 systemd[1]: Reached target swap.target - Swaps. Jan 24 00:44:39.885241 systemd[1]: Reached target timers.target - Timer Units. Jan 24 00:44:39.885250 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 24 00:44:39.885257 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 24 00:44:39.885264 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 24 00:44:39.885271 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 24 00:44:39.885277 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 24 00:44:39.885284 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 24 00:44:39.885291 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 24 00:44:39.885300 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 24 00:44:39.885306 systemd[1]: Reached target sockets.target - Socket Units. Jan 24 00:44:39.885313 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 24 00:44:39.885320 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 24 00:44:39.885326 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 24 00:44:39.885333 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 24 00:44:39.885340 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 24 00:44:39.885349 systemd[1]: Starting systemd-fsck-usr.service... Jan 24 00:44:39.885356 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 24 00:44:39.885363 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules.
Jan 24 00:44:39.885370 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 24 00:44:39.885379 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 24 00:44:39.885385 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 24 00:44:39.885392 systemd[1]: Finished systemd-fsck-usr.service. Jan 24 00:44:39.885399 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 24 00:44:39.885441 systemd-journald[319]: Collecting audit messages is enabled. Jan 24 00:44:39.885460 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 24 00:44:39.885468 systemd-journald[319]: Journal started Jan 24 00:44:39.885482 systemd-journald[319]: Runtime Journal (/run/log/journal/8bb508c677ba42929113b30570784311) is 8M, max 76M, 68M free. Jan 24 00:44:39.899140 kernel: audit: type=1130 audit(1769215479.885:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.899316 systemd[1]: Started systemd-journald.service - Journal Service. Jan 24 00:44:39.885000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.903286 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 24 00:44:39.909607 kernel: Bridge firewalling registered Jan 24 00:44:39.909639 kernel: audit: type=1130 audit(1769215479.902:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.902000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.909625 systemd-modules-load[323]: Inserted module 'br_netfilter' Jan 24 00:44:39.910329 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 24 00:44:39.910000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.911388 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 24 00:44:39.922593 kernel: audit: type=1130 audit(1769215479.910:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.922614 kernel: audit: type=1130 audit(1769215479.916:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.916000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.923288 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
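The bridge/br_netfilter lines record systemd-modules-load honouring an explicit module list: as the kernel message says, br_netfilter is no longer pulled in implicitly, so it has to be requested. A minimal sketch of the standard mechanism (the file name is illustrative; any *.conf under modules-load.d works):

  # /etc/modules-load.d/br_netfilter.conf
  br_netfilter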
Jan 24 00:44:39.925538 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 24 00:44:39.927551 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 24 00:44:39.932804 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 24 00:44:39.946000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.946542 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 24 00:44:39.952659 kernel: audit: type=1130 audit(1769215479.946:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.954591 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 24 00:44:39.956185 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 24 00:44:39.963482 kernel: audit: type=1130 audit(1769215479.957:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.957000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.957758 systemd-tmpfiles[340]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 24 00:44:39.969752 kernel: audit: type=1130 audit(1769215479.963:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.963000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.958226 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 24 00:44:39.974759 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 24 00:44:39.975000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:39.979535 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 24 00:44:39.982480 kernel: audit: type=1130 audit(1769215479.975:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:44:39.976000 audit: BPF prog-id=6 op=LOAD Jan 24 00:44:39.985462 kernel: audit: type=1334 audit(1769215479.976:10): prog-id=6 op=LOAD Jan 24 00:44:39.992081 dracut-cmdline[354]: dracut-109 Jan 24 00:44:39.995380 dracut-cmdline[354]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ccc6714d5701627f00a0daea097f593263f2ea87c850869ae25db66d36e22877 Jan 24 00:44:40.022958 systemd-resolved[358]: Positive Trust Anchors: Jan 24 00:44:40.022970 systemd-resolved[358]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 24 00:44:40.022974 systemd-resolved[358]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 24 00:44:40.022995 systemd-resolved[358]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 24 00:44:40.045773 systemd-resolved[358]: Defaulting to hostname 'linux'. Jan 24 00:44:40.047033 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 24 00:44:40.047000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.047955 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 24 00:44:40.087477 kernel: Loading iSCSI transport class v2.0-870. Jan 24 00:44:40.101465 kernel: iscsi: registered transport (tcp) Jan 24 00:44:40.122015 kernel: iscsi: registered transport (qla4xxx) Jan 24 00:44:40.122080 kernel: QLogic iSCSI HBA Driver Jan 24 00:44:40.149144 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 24 00:44:40.184023 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 24 00:44:40.186000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.188708 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 24 00:44:40.280970 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 24 00:44:40.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.284845 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 24 00:44:40.289693 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 24 00:44:40.334893 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
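The verity.usr=/verity.usrhash= arguments repeated in the dracut command line are what back the /dev/mapper/usr device that /usr was made read-only from earlier. Once the mapping exists it can be inspected with the device-mapper tools; a short sketch (assuming the mapping keeps the name "usr" shown above):

  # show the verity parameters and whether the device verified cleanly
  veritysetup status usr
  # the root digest in the table should correspond to verity.usrhash= on the kernel command line
  dmsetup table usr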
Jan 24 00:44:40.335000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.338000 audit: BPF prog-id=7 op=LOAD Jan 24 00:44:40.338000 audit: BPF prog-id=8 op=LOAD Jan 24 00:44:40.340660 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 24 00:44:40.386100 systemd-udevd[587]: Using default interface naming scheme 'v257'. Jan 24 00:44:40.400000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.400357 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 24 00:44:40.404547 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 24 00:44:40.427537 dracut-pre-trigger[652]: rd.md=0: removing MD RAID activation Jan 24 00:44:40.432936 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 24 00:44:40.434000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.435000 audit: BPF prog-id=9 op=LOAD Jan 24 00:44:40.437584 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 24 00:44:40.454000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.454318 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 24 00:44:40.457552 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 24 00:44:40.477846 systemd-networkd[706]: lo: Link UP Jan 24 00:44:40.478000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.477852 systemd-networkd[706]: lo: Gained carrier Jan 24 00:44:40.478829 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 24 00:44:40.479280 systemd[1]: Reached target network.target - Network. Jan 24 00:44:40.564900 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 24 00:44:40.567000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.571256 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 24 00:44:40.714227 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jan 24 00:44:40.731106 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jan 24 00:44:40.758468 kernel: cryptd: max_cpu_qlen set to 1000 Jan 24 00:44:40.774340 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jan 24 00:44:40.778716 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
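disk-uuid.service, per its description, regenerates the disk's GPT UUID when needed, typically so machines provisioned from the same image don't share a GUID. A manual equivalent with gdisk's scriptable front end (illustrative only, not necessarily what the unit runs internally; /dev/sda as above):

  # give the disk a fresh random GUID, leaving partition contents alone
  sgdisk --disk-guid=R /dev/sda
  # or randomize the disk GUID and every partition's unique GUID in one go
  sgdisk --randomize-guids /dev/sda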
Jan 24 00:44:40.783502 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Jan 24 00:44:40.793403 systemd-networkd[706]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 24 00:44:40.793434 systemd-networkd[706]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 24 00:44:40.794152 systemd-networkd[706]: eth0: Link UP Jan 24 00:44:40.794568 systemd-networkd[706]: eth0: Gained carrier Jan 24 00:44:40.794577 systemd-networkd[706]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 24 00:44:40.797350 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 24 00:44:40.805283 kernel: AES CTR mode by8 optimization enabled Jan 24 00:44:40.808574 disk-uuid[773]: Primary Header is updated. Jan 24 00:44:40.808574 disk-uuid[773]: Secondary Entries is updated. Jan 24 00:44:40.808574 disk-uuid[773]: Secondary Header is updated. Jan 24 00:44:40.810603 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 24 00:44:40.810690 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 24 00:44:40.812527 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 24 00:44:40.841179 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 24 00:44:40.841201 kernel: audit: type=1131 audit(1769215480.812:23): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.812000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.846276 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 24 00:44:40.856479 systemd-networkd[706]: eth0: DHCPv4 address 65.109.167.77/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 24 00:44:40.863314 systemd-networkd[706]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 24 00:44:40.863321 systemd-networkd[706]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 24 00:44:40.863718 systemd-networkd[706]: eth1: Link UP Jan 24 00:44:40.864524 systemd-networkd[706]: eth1: Gained carrier Jan 24 00:44:40.864533 systemd-networkd[706]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 24 00:44:40.866371 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 24 00:44:40.871776 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 24 00:44:40.895583 kernel: audit: type=1130 audit(1769215480.872:24): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.895605 kernel: audit: type=1131 audit(1769215480.872:25): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:44:40.872000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.872000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.886649 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 24 00:44:40.920124 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 24 00:44:40.926881 kernel: audit: type=1130 audit(1769215480.920:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.931836 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 24 00:44:40.937631 kernel: audit: type=1130 audit(1769215480.931:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.931000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.932725 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 24 00:44:40.938003 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 24 00:44:40.938747 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 24 00:44:40.940451 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 24 00:44:40.958905 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 24 00:44:40.964711 kernel: audit: type=1130 audit(1769215480.958:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:40.958000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:41.130533 systemd-networkd[706]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 24 00:44:41.867709 disk-uuid[775]: Warning: The kernel is still using the old partition table. Jan 24 00:44:41.867709 disk-uuid[775]: The new table will be used at the next reboot or after you Jan 24 00:44:41.867709 disk-uuid[775]: run partprobe(8) or kpartx(8) Jan 24 00:44:41.867709 disk-uuid[775]: The operation has completed successfully. Jan 24 00:44:41.882270 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 24 00:44:41.882558 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 24 00:44:41.910051 kernel: audit: type=1130 audit(1769215481.884:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:44:41.910107 kernel: audit: type=1131 audit(1769215481.884:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:41.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:41.884000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:41.887733 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 24 00:44:41.972488 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (889) Jan 24 00:44:41.979980 kernel: BTRFS info (device sda6): first mount of filesystem 98bf19d7-5744-4291-8d20-e6403ff726cc Jan 24 00:44:41.980070 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 24 00:44:41.995922 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 24 00:44:41.995986 kernel: BTRFS info (device sda6): turning on async discard Jan 24 00:44:41.996010 kernel: BTRFS info (device sda6): enabling free space tree Jan 24 00:44:42.015466 kernel: BTRFS info (device sda6): last unmount of filesystem 98bf19d7-5744-4291-8d20-e6403ff726cc Jan 24 00:44:42.016769 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 24 00:44:42.031406 kernel: audit: type=1130 audit(1769215482.017:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.017000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.021783 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 24 00:44:42.231082 ignition[908]: Ignition 2.24.0 Jan 24 00:44:42.231094 ignition[908]: Stage: fetch-offline Jan 24 00:44:42.231134 ignition[908]: no configs at "/usr/lib/ignition/base.d" Jan 24 00:44:42.231144 ignition[908]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 24 00:44:42.248981 kernel: audit: type=1130 audit(1769215482.234:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.234000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.234237 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 24 00:44:42.232388 ignition[908]: parsed url from cmdline: "" Jan 24 00:44:42.237548 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 24 00:44:42.232393 ignition[908]: no config URL provided Jan 24 00:44:42.232398 ignition[908]: reading system config file "/usr/lib/ignition/user.ign" Jan 24 00:44:42.232448 ignition[908]: no config at "/usr/lib/ignition/user.ign" Jan 24 00:44:42.232453 ignition[908]: failed to fetch config: resource requires networking Jan 24 00:44:42.232891 ignition[908]: Ignition finished successfully Jan 24 00:44:42.263955 ignition[918]: Ignition 2.24.0 Jan 24 00:44:42.263977 ignition[918]: Stage: fetch Jan 24 00:44:42.264947 systemd-networkd[706]: eth1: Gained IPv6LL Jan 24 00:44:42.264076 ignition[918]: no configs at "/usr/lib/ignition/base.d" Jan 24 00:44:42.264085 ignition[918]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 24 00:44:42.264139 ignition[918]: parsed url from cmdline: "" Jan 24 00:44:42.264142 ignition[918]: no config URL provided Jan 24 00:44:42.264147 ignition[918]: reading system config file "/usr/lib/ignition/user.ign" Jan 24 00:44:42.264153 ignition[918]: no config at "/usr/lib/ignition/user.ign" Jan 24 00:44:42.264181 ignition[918]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jan 24 00:44:42.274205 ignition[918]: GET result: OK Jan 24 00:44:42.274264 ignition[918]: parsing config with SHA512: 25cf7cb798ac9f36b1f8b050ce71999fb1e4a28db18afabd68bb43c6157f36632152819fdcb42d63afe82d8ba3885bb7da608931dfd616b84c46763eb3760a54 Jan 24 00:44:42.279180 unknown[918]: fetched base config from "system" Jan 24 00:44:42.279187 unknown[918]: fetched base config from "system" Jan 24 00:44:42.279387 ignition[918]: fetch: fetch complete Jan 24 00:44:42.279192 unknown[918]: fetched user config from "hetzner" Jan 24 00:44:42.279391 ignition[918]: fetch: fetch passed Jan 24 00:44:42.284281 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 24 00:44:42.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.279438 ignition[918]: Ignition finished successfully Jan 24 00:44:42.286350 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 24 00:44:42.331819 ignition[925]: Ignition 2.24.0 Jan 24 00:44:42.332493 ignition[925]: Stage: kargs Jan 24 00:44:42.332643 ignition[925]: no configs at "/usr/lib/ignition/base.d" Jan 24 00:44:42.332651 ignition[925]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 24 00:44:42.335529 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 24 00:44:42.336000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.333246 ignition[925]: kargs: kargs passed Jan 24 00:44:42.333281 ignition[925]: Ignition finished successfully Jan 24 00:44:42.338660 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 24 00:44:42.358457 ignition[931]: Ignition 2.24.0 Jan 24 00:44:42.358466 ignition[931]: Stage: disks Jan 24 00:44:42.358581 ignition[931]: no configs at "/usr/lib/ignition/base.d" Jan 24 00:44:42.361000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.361491 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
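The fetch stage above pulled the user-supplied Ignition config from Hetzner's link-local metadata service at the URL shown in the log. The same endpoint can be queried from a shell on the instance when debugging (path copied from the log; whether it still answers outside early boot is up to the provider):

  curl -s http://169.254.169.254/hetzner/v1/userdata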
Jan 24 00:44:42.358589 ignition[931]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 24 00:44:42.359194 ignition[931]: disks: disks passed Jan 24 00:44:42.359229 ignition[931]: Ignition finished successfully Jan 24 00:44:42.363563 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 24 00:44:42.364539 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 24 00:44:42.365444 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 24 00:44:42.366416 systemd[1]: Reached target sysinit.target - System Initialization. Jan 24 00:44:42.367361 systemd[1]: Reached target basic.target - Basic System. Jan 24 00:44:42.370176 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 24 00:44:42.410912 systemd-fsck[940]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 24 00:44:42.413436 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 24 00:44:42.413000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.415534 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 24 00:44:42.543071 kernel: EXT4-fs (sda9): mounted filesystem 4e30a7d6-83d2-471c-98e0-68a57c0656af r/w with ordered data mode. Quota mode: none. Jan 24 00:44:42.543258 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 24 00:44:42.544071 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 24 00:44:42.546061 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 24 00:44:42.547401 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 24 00:44:42.552580 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jan 24 00:44:42.553537 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 24 00:44:42.553598 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 24 00:44:42.560726 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 24 00:44:42.567538 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 24 00:44:42.577530 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (948) Jan 24 00:44:42.584687 kernel: BTRFS info (device sda6): first mount of filesystem 98bf19d7-5744-4291-8d20-e6403ff726cc Jan 24 00:44:42.584728 systemd-networkd[706]: eth0: Gained IPv6LL Jan 24 00:44:42.591469 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 24 00:44:42.613280 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 24 00:44:42.613366 kernel: BTRFS info (device sda6): turning on async discard Jan 24 00:44:42.613467 kernel: BTRFS info (device sda6): enabling free space tree Jan 24 00:44:42.620240 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 24 00:44:42.710897 coreos-metadata[950]: Jan 24 00:44:42.710 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jan 24 00:44:42.713793 coreos-metadata[950]: Jan 24 00:44:42.712 INFO Fetch successful Jan 24 00:44:42.716660 coreos-metadata[950]: Jan 24 00:44:42.714 INFO wrote hostname ci-4593-0-0-9-1308b066bf to /sysroot/etc/hostname Jan 24 00:44:42.721208 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 24 00:44:42.722000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.941002 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 24 00:44:42.941000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:42.945566 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 24 00:44:42.948691 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 24 00:44:42.977941 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 24 00:44:42.983530 kernel: BTRFS info (device sda6): last unmount of filesystem 98bf19d7-5744-4291-8d20-e6403ff726cc Jan 24 00:44:43.018847 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 24 00:44:43.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:43.029814 ignition[1050]: INFO : Ignition 2.24.0 Jan 24 00:44:43.031402 ignition[1050]: INFO : Stage: mount Jan 24 00:44:43.031402 ignition[1050]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 24 00:44:43.031402 ignition[1050]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 24 00:44:43.036786 ignition[1050]: INFO : mount: mount passed Jan 24 00:44:43.036786 ignition[1050]: INFO : Ignition finished successfully Jan 24 00:44:43.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:43.036906 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 24 00:44:43.041164 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 24 00:44:43.071461 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 24 00:44:43.118995 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1060) Jan 24 00:44:43.119069 kernel: BTRFS info (device sda6): first mount of filesystem 98bf19d7-5744-4291-8d20-e6403ff726cc Jan 24 00:44:43.125053 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 24 00:44:43.139084 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 24 00:44:43.139142 kernel: BTRFS info (device sda6): turning on async discard Jan 24 00:44:43.143147 kernel: BTRFS info (device sda6): enabling free space tree Jan 24 00:44:43.150818 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 24 00:44:43.195653 ignition[1077]: INFO : Ignition 2.24.0 Jan 24 00:44:43.195653 ignition[1077]: INFO : Stage: files Jan 24 00:44:43.198538 ignition[1077]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 24 00:44:43.198538 ignition[1077]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 24 00:44:43.198538 ignition[1077]: DEBUG : files: compiled without relabeling support, skipping Jan 24 00:44:43.201499 ignition[1077]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 24 00:44:43.201499 ignition[1077]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 24 00:44:43.208808 ignition[1077]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 24 00:44:43.210268 ignition[1077]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 24 00:44:43.211110 ignition[1077]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 24 00:44:43.210574 unknown[1077]: wrote ssh authorized keys file for user: core Jan 24 00:44:43.213301 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 24 00:44:43.213301 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Jan 24 00:44:43.425458 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 24 00:44:43.728180 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 24 00:44:43.728180 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 24 00:44:43.730838 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 24 00:44:43.730838 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 24 00:44:43.730838 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 24 00:44:43.730838 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 24 00:44:43.730838 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 24 00:44:43.730838 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 24 00:44:43.730838 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 24 00:44:43.736614 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 24 00:44:43.736614 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 24 00:44:43.736614 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 24 00:44:43.736614 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 24 00:44:43.736614 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 24 00:44:43.736614 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Jan 24 00:44:44.133615 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 24 00:44:44.530646 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 24 00:44:44.530646 ignition[1077]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 24 00:44:44.534225 ignition[1077]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 24 00:44:44.538385 ignition[1077]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 24 00:44:44.538385 ignition[1077]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 24 00:44:44.538385 ignition[1077]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jan 24 00:44:44.541880 ignition[1077]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 24 00:44:44.541880 ignition[1077]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 24 00:44:44.541880 ignition[1077]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jan 24 00:44:44.541880 ignition[1077]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jan 24 00:44:44.541880 ignition[1077]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jan 24 00:44:44.541880 ignition[1077]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 24 00:44:44.541880 ignition[1077]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 24 00:44:44.541880 ignition[1077]: INFO : files: files passed Jan 24 00:44:44.541880 ignition[1077]: INFO : Ignition finished successfully Jan 24 00:44:44.545000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.544254 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 24 00:44:44.549716 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 24 00:44:44.558724 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 24 00:44:44.576373 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 24 00:44:44.577008 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
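The files stage is driven entirely by the Ignition config fetched earlier; the log lists the resulting operations but not the config itself. A minimal, illustrative spec-3.x sketch that would produce the same kinds of operations (paths taken from the log; the SSH key, file contents, and unit body are placeholders):

  {
    "ignition": { "version": "3.4.0" },
    "passwd": {
      "users": [
        { "name": "core", "sshAuthorizedKeys": ["ssh-ed25519 AAAA... placeholder"] }
      ]
    },
    "storage": {
      "files": [
        { "path": "/etc/flatcar/update.conf", "mode": 420,
          "contents": { "source": "data:,GROUP%3Dstable%0A" } }
      ],
      "links": [
        { "path": "/etc/extensions/kubernetes.raw",
          "target": "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" }
      ]
    },
    "systemd": {
      "units": [
        { "name": "prepare-helm.service", "enabled": true,
          "contents": "[Unit]\nDescription=placeholder\n[Install]\nWantedBy=multi-user.target\n" }
      ]
    }
  }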
Jan 24 00:44:44.579000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.579000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.593683 initrd-setup-root-after-ignition[1109]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 24 00:44:44.593683 initrd-setup-root-after-ignition[1109]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 24 00:44:44.598020 initrd-setup-root-after-ignition[1113]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 24 00:44:44.601747 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 24 00:44:44.602000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.603193 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 24 00:44:44.606363 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 24 00:44:44.681360 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 24 00:44:44.681701 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 24 00:44:44.683000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.683000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.684724 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 24 00:44:44.686230 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 24 00:44:44.689823 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 24 00:44:44.691972 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 24 00:44:44.738247 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 24 00:44:44.739000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.744291 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 24 00:44:44.778936 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 24 00:44:44.779182 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 24 00:44:44.780573 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 24 00:44:44.782686 systemd[1]: Stopped target timers.target - Timer Units. Jan 24 00:44:44.786000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:44:44.784704 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 24 00:44:44.785063 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 24 00:44:44.787585 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 24 00:44:44.789653 systemd[1]: Stopped target basic.target - Basic System. Jan 24 00:44:44.792255 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 24 00:44:44.794242 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 24 00:44:44.796130 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 24 00:44:44.798204 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 24 00:44:44.800066 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 24 00:44:44.801911 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 24 00:44:44.804066 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 24 00:44:44.805997 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 24 00:44:44.807833 systemd[1]: Stopped target swap.target - Swaps. Jan 24 00:44:44.811000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.809797 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 24 00:44:44.810068 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 24 00:44:44.812646 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 24 00:44:44.814602 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 24 00:44:44.816397 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 24 00:44:44.817538 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 24 00:44:44.820000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.819196 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 24 00:44:44.819399 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 24 00:44:44.823000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.822107 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 24 00:44:44.825000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.822410 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 24 00:44:44.827000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.824191 systemd[1]: ignition-files.service: Deactivated successfully. Jan 24 00:44:44.824377 systemd[1]: Stopped ignition-files.service - Ignition (files). 
Jan 24 00:44:44.826374 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jan 24 00:44:44.826706 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 24 00:44:44.831810 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 24 00:44:44.832850 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 24 00:44:44.833064 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 24 00:44:44.837000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.840252 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 24 00:44:44.843383 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 24 00:44:44.844748 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 24 00:44:44.846000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.847127 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 24 00:44:44.848340 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 24 00:44:44.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.850560 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 24 00:44:44.851619 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 24 00:44:44.853000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.864031 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 24 00:44:44.864226 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 24 00:44:44.866000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.866000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.878997 ignition[1133]: INFO : Ignition 2.24.0 Jan 24 00:44:44.878997 ignition[1133]: INFO : Stage: umount Jan 24 00:44:44.882356 ignition[1133]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 24 00:44:44.882356 ignition[1133]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 24 00:44:44.886190 ignition[1133]: INFO : umount: umount passed Jan 24 00:44:44.886190 ignition[1133]: INFO : Ignition finished successfully Jan 24 00:44:44.885982 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 24 00:44:44.886964 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 24 00:44:44.889000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:44:44.889849 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 24 00:44:44.889929 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 24 00:44:44.892000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.892622 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 24 00:44:44.893000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.894000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.892703 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 24 00:44:44.894157 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 24 00:44:44.894240 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 24 00:44:44.894978 systemd[1]: Stopped target network.target - Network. Jan 24 00:44:44.898572 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 24 00:44:44.899277 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 24 00:44:44.900000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.900842 systemd[1]: Stopped target paths.target - Path Units. Jan 24 00:44:44.902282 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 24 00:44:44.907703 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 24 00:44:44.909062 systemd[1]: Stopped target slices.target - Slice Units. Jan 24 00:44:44.910497 systemd[1]: Stopped target sockets.target - Socket Units. Jan 24 00:44:44.911918 systemd[1]: iscsid.socket: Deactivated successfully. Jan 24 00:44:44.911993 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 24 00:44:44.913367 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 24 00:44:44.913467 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 24 00:44:44.915659 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 24 00:44:44.915723 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 24 00:44:44.917000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.916854 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 24 00:44:44.918000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.916942 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 24 00:44:44.918152 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 24 00:44:44.918222 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 24 00:44:44.919503 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Jan 24 00:44:44.920734 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 24 00:44:44.925174 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 24 00:44:44.926208 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 24 00:44:44.926375 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 24 00:44:44.929000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.930316 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 24 00:44:44.930713 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 24 00:44:44.931000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.932911 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 24 00:44:44.933106 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 24 00:44:44.933000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.938000 audit: BPF prog-id=6 op=UNLOAD Jan 24 00:44:44.940229 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 24 00:44:44.940554 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 24 00:44:44.941000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.943000 audit: BPF prog-id=9 op=UNLOAD Jan 24 00:44:44.945163 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 24 00:44:44.946002 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 24 00:44:44.946063 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 24 00:44:44.948616 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 24 00:44:44.950814 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 24 00:44:44.952000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.953000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.950899 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 24 00:44:44.956000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.952722 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 24 00:44:44.952840 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 24 00:44:44.953752 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 24 00:44:44.953850 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. 
Jan 24 00:44:44.956700 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 24 00:44:44.974878 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 24 00:44:44.975201 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 24 00:44:44.977000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.978991 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 24 00:44:44.979604 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 24 00:44:44.981273 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 24 00:44:44.981339 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 24 00:44:44.982102 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 24 00:44:44.984000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.985000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.982180 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 24 00:44:44.984974 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 24 00:44:44.986000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.985053 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 24 00:44:44.985928 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 24 00:44:44.986008 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 24 00:44:44.992000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.988059 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 24 00:44:44.992246 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 24 00:44:44.992383 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 24 00:44:44.996000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.995555 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 24 00:44:44.998000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:44.995636 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 24 00:44:44.996940 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 24 00:44:44.997018 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 24 00:44:45.014127 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 24 00:44:45.019665 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 24 00:44:45.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:45.022000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:45.025689 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 24 00:44:45.025877 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 24 00:44:45.027000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:45.028138 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 24 00:44:45.030711 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 24 00:44:45.069866 systemd[1]: Switching root. Jan 24 00:44:45.112872 systemd-journald[319]: Journal stopped Jan 24 00:44:47.014339 systemd-journald[319]: Received SIGTERM from PID 1 (systemd). Jan 24 00:44:47.014409 kernel: SELinux: policy capability network_peer_controls=1 Jan 24 00:44:47.014469 kernel: SELinux: policy capability open_perms=1 Jan 24 00:44:47.014480 kernel: SELinux: policy capability extended_socket_class=1 Jan 24 00:44:47.014490 kernel: SELinux: policy capability always_check_network=0 Jan 24 00:44:47.014506 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 24 00:44:47.014519 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 24 00:44:47.014528 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 24 00:44:47.014537 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 24 00:44:47.014577 kernel: SELinux: policy capability userspace_initial_context=0 Jan 24 00:44:47.014586 systemd[1]: Successfully loaded SELinux policy in 79.143ms. Jan 24 00:44:47.014605 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 16.945ms. Jan 24 00:44:47.014615 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 24 00:44:47.014625 systemd[1]: Detected virtualization kvm. Jan 24 00:44:47.014638 systemd[1]: Detected architecture x86-64. Jan 24 00:44:47.014648 systemd[1]: Detected first boot. Jan 24 00:44:47.014662 systemd[1]: Hostname set to . Jan 24 00:44:47.014675 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 24 00:44:47.014685 zram_generator::config[1176]: No configuration found. Jan 24 00:44:47.014699 kernel: Guest personality initialized and is inactive Jan 24 00:44:47.014712 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 24 00:44:47.014721 kernel: Initialized host personality Jan 24 00:44:47.014732 kernel: NET: Registered PF_VSOCK protocol family Jan 24 00:44:47.014742 systemd[1]: Populated /etc with preset unit settings. 
Jan 24 00:44:47.014751 kernel: kauditd_printk_skb: 59 callbacks suppressed Jan 24 00:44:47.014762 kernel: audit: type=1334 audit(1769215486.504:92): prog-id=12 op=LOAD Jan 24 00:44:47.014771 kernel: audit: type=1334 audit(1769215486.504:93): prog-id=3 op=UNLOAD Jan 24 00:44:47.014780 kernel: audit: type=1334 audit(1769215486.505:94): prog-id=13 op=LOAD Jan 24 00:44:47.014789 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 24 00:44:47.014801 kernel: audit: type=1334 audit(1769215486.505:95): prog-id=14 op=LOAD Jan 24 00:44:47.014810 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 24 00:44:47.014820 kernel: audit: type=1334 audit(1769215486.505:96): prog-id=4 op=UNLOAD Jan 24 00:44:47.014829 kernel: audit: type=1334 audit(1769215486.505:97): prog-id=5 op=UNLOAD Jan 24 00:44:47.014838 kernel: audit: type=1131 audit(1769215486.510:98): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.014847 kernel: audit: type=1334 audit(1769215486.536:99): prog-id=12 op=UNLOAD Jan 24 00:44:47.014856 kernel: audit: type=1130 audit(1769215486.548:100): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.014868 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 24 00:44:47.014878 kernel: audit: type=1131 audit(1769215486.548:101): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.014892 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 24 00:44:47.014902 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 24 00:44:47.014911 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 24 00:44:47.014923 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 24 00:44:47.014933 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 24 00:44:47.014943 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 24 00:44:47.014952 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 24 00:44:47.014962 systemd[1]: Created slice user.slice - User and Session Slice. Jan 24 00:44:47.014971 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 24 00:44:47.014983 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 24 00:44:47.014993 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 24 00:44:47.015003 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 24 00:44:47.015012 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 24 00:44:47.015022 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 24 00:44:47.015034 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... 
Jan 24 00:44:47.015045 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 24 00:44:47.015055 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 24 00:44:47.015065 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 24 00:44:47.015074 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 24 00:44:47.015084 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 24 00:44:47.015093 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 24 00:44:47.015102 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 24 00:44:47.015112 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 24 00:44:47.015124 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 24 00:44:47.015133 systemd[1]: Reached target slices.target - Slice Units. Jan 24 00:44:47.015143 systemd[1]: Reached target swap.target - Swaps. Jan 24 00:44:47.015152 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 24 00:44:47.015162 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 24 00:44:47.015171 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 24 00:44:47.015181 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 24 00:44:47.015193 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 24 00:44:47.015202 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 24 00:44:47.015212 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 24 00:44:47.015221 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 24 00:44:47.015231 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 24 00:44:47.015241 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 24 00:44:47.015251 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 24 00:44:47.015263 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 24 00:44:47.015272 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 24 00:44:47.015282 systemd[1]: Mounting media.mount - External Media Directory... Jan 24 00:44:47.015291 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 24 00:44:47.015301 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 24 00:44:47.015310 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 24 00:44:47.015320 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 24 00:44:47.015331 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 24 00:44:47.015345 systemd[1]: Reached target machines.target - Containers. Jan 24 00:44:47.015355 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 24 00:44:47.015365 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 24 00:44:47.015376 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jan 24 00:44:47.015385 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 24 00:44:47.015397 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 24 00:44:47.015406 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 24 00:44:47.018077 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 24 00:44:47.018108 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 24 00:44:47.018120 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 24 00:44:47.018131 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 24 00:44:47.018141 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 24 00:44:47.018155 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 24 00:44:47.018164 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 24 00:44:47.018174 systemd[1]: Stopped systemd-fsck-usr.service. Jan 24 00:44:47.018185 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 24 00:44:47.018197 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 24 00:44:47.018206 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 24 00:44:47.018216 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 24 00:44:47.018226 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 24 00:44:47.018236 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 24 00:44:47.018246 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 24 00:44:47.018256 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 24 00:44:47.018268 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 24 00:44:47.018277 kernel: fuse: init (API version 7.41) Jan 24 00:44:47.018289 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 24 00:44:47.018300 systemd[1]: Mounted media.mount - External Media Directory. Jan 24 00:44:47.018312 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 24 00:44:47.018322 kernel: ACPI: bus type drm_connector registered Jan 24 00:44:47.018331 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 24 00:44:47.018342 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 24 00:44:47.018352 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 24 00:44:47.018361 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 24 00:44:47.018371 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 24 00:44:47.018383 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 24 00:44:47.018393 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 24 00:44:47.018404 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 24 00:44:47.018414 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Jan 24 00:44:47.018479 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 24 00:44:47.018489 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 24 00:44:47.018499 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 24 00:44:47.018509 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 24 00:44:47.018521 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 24 00:44:47.018530 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 24 00:44:47.018548 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 24 00:44:47.018558 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 24 00:44:47.018589 systemd-journald[1255]: Collecting audit messages is enabled. Jan 24 00:44:47.018612 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 24 00:44:47.018624 systemd-journald[1255]: Journal started Jan 24 00:44:47.018641 systemd-journald[1255]: Runtime Journal (/run/log/journal/8bb508c677ba42929113b30570784311) is 8M, max 76M, 68M free. Jan 24 00:44:46.697000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 24 00:44:46.857000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.863000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.869000 audit: BPF prog-id=14 op=UNLOAD Jan 24 00:44:46.869000 audit: BPF prog-id=13 op=UNLOAD Jan 24 00:44:46.872000 audit: BPF prog-id=15 op=LOAD Jan 24 00:44:46.872000 audit: BPF prog-id=16 op=LOAD Jan 24 00:44:46.872000 audit: BPF prog-id=17 op=LOAD Jan 24 00:44:46.959000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.966000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.966000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.974000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.974000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:44:46.981000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.990000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.990000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.997000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.997000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.003000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.003000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.006000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.010000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 24 00:44:47.010000 audit[1255]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=5 a1=7ffdcf468150 a2=4000 a3=0 items=0 ppid=1 pid=1255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:44:47.010000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 24 00:44:47.015000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:46.480159 systemd[1]: Queued start job for default target multi-user.target. Jan 24 00:44:46.506906 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 24 00:44:46.510363 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 24 00:44:47.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.022450 systemd[1]: Started systemd-journald.service - Journal Service. 
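
The SERVICE_START/SERVICE_STOP audit records scattered through this log have a fixed shape (unit=..., comm="systemd", res=...), so a unit start/stop timeline can be recovered from a saved copy of the log text. A minimal Python sketch, assuming the log has been written to a plain-text file; the name boot.log is purely illustrative:

import re

# Matches audit records of the shape seen in this log, e.g.:
#   audit[1]: SERVICE_STOP ... msg='unit=ignition-disks comm="systemd" ... res=success'
SERVICE_RE = re.compile(
    r"audit\[\d+\]: (?P<event>SERVICE_START|SERVICE_STOP)\b.*?"
    r"msg='unit=(?P<unit>\S+).*?res=(?P<result>\w+)'"
)

def service_events(path="boot.log"):  # hypothetical file name
    text = open(path, encoding="utf-8", errors="replace").read()
    for m in SERVICE_RE.finditer(text):
        yield m.group("event"), m.group("unit"), m.group("result")

for event, unit, result in service_events():
    print(f"{event:13} {unit:40} {result}")
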
Jan 24 00:44:47.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.024539 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 24 00:44:47.024000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.025359 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 24 00:44:47.025000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.037169 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 24 00:44:47.038732 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 24 00:44:47.039225 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 24 00:44:47.039280 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 24 00:44:47.040460 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 24 00:44:47.041026 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 24 00:44:47.041179 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 24 00:44:47.045561 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 24 00:44:47.047639 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 24 00:44:47.048010 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 24 00:44:47.050612 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 24 00:44:47.051070 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 24 00:44:47.057492 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 24 00:44:47.062614 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 24 00:44:47.064581 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 24 00:44:47.070784 systemd-journald[1255]: Time spent on flushing to /var/log/journal/8bb508c677ba42929113b30570784311 is 55.896ms for 1371 entries. Jan 24 00:44:47.070784 systemd-journald[1255]: System Journal (/var/log/journal/8bb508c677ba42929113b30570784311) is 8M, max 588.1M, 580.1M free. Jan 24 00:44:47.153017 systemd-journald[1255]: Received client request to flush runtime journal. Jan 24 00:44:47.153072 kernel: loop1: detected capacity change from 0 to 229808 Jan 24 00:44:47.072000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:44:47.112000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.122000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.071470 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 24 00:44:47.073177 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 24 00:44:47.078606 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 24 00:44:47.112791 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 24 00:44:47.155000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.122719 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 24 00:44:47.158000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.155992 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 24 00:44:47.157861 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 24 00:44:47.162563 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 24 00:44:47.162000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.164000 audit: BPF prog-id=18 op=LOAD Jan 24 00:44:47.164000 audit: BPF prog-id=19 op=LOAD Jan 24 00:44:47.164000 audit: BPF prog-id=20 op=LOAD Jan 24 00:44:47.168000 audit: BPF prog-id=21 op=LOAD Jan 24 00:44:47.168559 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 24 00:44:47.171493 kernel: loop2: detected capacity change from 0 to 8 Jan 24 00:44:47.171748 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 24 00:44:47.173637 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 24 00:44:47.185000 audit: BPF prog-id=22 op=LOAD Jan 24 00:44:47.185000 audit: BPF prog-id=23 op=LOAD Jan 24 00:44:47.185000 audit: BPF prog-id=24 op=LOAD Jan 24 00:44:47.189000 audit: BPF prog-id=25 op=LOAD Jan 24 00:44:47.189000 audit: BPF prog-id=26 op=LOAD Jan 24 00:44:47.189000 audit: BPF prog-id=27 op=LOAD Jan 24 00:44:47.188539 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 24 00:44:47.190725 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 24 00:44:47.194469 kernel: loop3: detected capacity change from 0 to 50784 Jan 24 00:44:47.225604 systemd-tmpfiles[1320]: ACLs are not supported, ignoring. Jan 24 00:44:47.225622 systemd-tmpfiles[1320]: ACLs are not supported, ignoring. 
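
The journal-flush entries above show the runtime journal being copied to the persistent store under /var/log/journal; once that has happened, the same records can be read back programmatically. A small sketch using journalctl's standard JSON output (the unit filter here is only an example):

import json
import subprocess

# journalctl -o json prints one JSON object per entry, one per line;
# -b limits output to the current boot, -u to a single unit.
proc = subprocess.run(
    ["journalctl", "-b", "-u", "systemd-journald", "-o", "json"],
    capture_output=True, text=True, check=True,
)
for line in proc.stdout.splitlines():
    entry = json.loads(line)
    print(entry.get("__REALTIME_TIMESTAMP"), entry.get("MESSAGE"))
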
Jan 24 00:44:47.236136 kernel: loop4: detected capacity change from 0 to 111560 Jan 24 00:44:47.235000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.235213 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 24 00:44:47.271504 kernel: loop5: detected capacity change from 0 to 229808 Jan 24 00:44:47.271942 systemd-nsresourced[1321]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 24 00:44:47.276527 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 24 00:44:47.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.281584 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 24 00:44:47.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.296143 kernel: loop6: detected capacity change from 0 to 8 Jan 24 00:44:47.308148 kernel: loop7: detected capacity change from 0 to 50784 Jan 24 00:44:47.335446 kernel: loop1: detected capacity change from 0 to 111560 Jan 24 00:44:47.352788 (sd-merge)[1328]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Jan 24 00:44:47.360293 (sd-merge)[1328]: Merged extensions into '/usr'. Jan 24 00:44:47.364051 systemd-oomd[1316]: No swap; memory pressure usage will be degraded Jan 24 00:44:47.364932 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 24 00:44:47.365000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.369302 systemd[1]: Reload requested from client PID 1300 ('systemd-sysext') (unit systemd-sysext.service)... Jan 24 00:44:47.369313 systemd[1]: Reloading... Jan 24 00:44:47.407756 systemd-resolved[1317]: Positive Trust Anchors: Jan 24 00:44:47.408103 systemd-resolved[1317]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 24 00:44:47.408112 systemd-resolved[1317]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 24 00:44:47.408145 systemd-resolved[1317]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 24 00:44:47.428110 systemd-resolved[1317]: Using system hostname 'ci-4593-0-0-9-1308b066bf'. Jan 24 00:44:47.468476 zram_generator::config[1371]: No configuration found. 
Jan 24 00:44:47.622231 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 24 00:44:47.622562 systemd[1]: Reloading finished in 252 ms. Jan 24 00:44:47.646302 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 24 00:44:47.646000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.647094 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 24 00:44:47.647000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.650471 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 24 00:44:47.650000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:47.651658 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 24 00:44:47.653600 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 24 00:44:47.654894 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 24 00:44:47.664571 systemd[1]: Starting ensure-sysext.service... Jan 24 00:44:47.666539 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 24 00:44:47.666000 audit: BPF prog-id=8 op=UNLOAD Jan 24 00:44:47.666000 audit: BPF prog-id=7 op=UNLOAD Jan 24 00:44:47.666000 audit: BPF prog-id=28 op=LOAD Jan 24 00:44:47.666000 audit: BPF prog-id=29 op=LOAD Jan 24 00:44:47.673000 audit: BPF prog-id=30 op=LOAD Jan 24 00:44:47.673000 audit: BPF prog-id=25 op=UNLOAD Jan 24 00:44:47.673000 audit: BPF prog-id=31 op=LOAD Jan 24 00:44:47.673000 audit: BPF prog-id=32 op=LOAD Jan 24 00:44:47.673000 audit: BPF prog-id=26 op=UNLOAD Jan 24 00:44:47.673000 audit: BPF prog-id=27 op=UNLOAD Jan 24 00:44:47.674000 audit: BPF prog-id=33 op=LOAD Jan 24 00:44:47.674000 audit: BPF prog-id=18 op=UNLOAD Jan 24 00:44:47.674000 audit: BPF prog-id=34 op=LOAD Jan 24 00:44:47.674000 audit: BPF prog-id=35 op=LOAD Jan 24 00:44:47.674000 audit: BPF prog-id=19 op=UNLOAD Jan 24 00:44:47.674000 audit: BPF prog-id=20 op=UNLOAD Jan 24 00:44:47.669652 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 24 00:44:47.676000 audit: BPF prog-id=36 op=LOAD Jan 24 00:44:47.676000 audit: BPF prog-id=15 op=UNLOAD Jan 24 00:44:47.677000 audit: BPF prog-id=37 op=LOAD Jan 24 00:44:47.677000 audit: BPF prog-id=38 op=LOAD Jan 24 00:44:47.677000 audit: BPF prog-id=16 op=UNLOAD Jan 24 00:44:47.677000 audit: BPF prog-id=17 op=UNLOAD Jan 24 00:44:47.677000 audit: BPF prog-id=39 op=LOAD Jan 24 00:44:47.677000 audit: BPF prog-id=21 op=UNLOAD Jan 24 00:44:47.679000 audit: BPF prog-id=40 op=LOAD Jan 24 00:44:47.679000 audit: BPF prog-id=22 op=UNLOAD Jan 24 00:44:47.679000 audit: BPF prog-id=41 op=LOAD Jan 24 00:44:47.679000 audit: BPF prog-id=42 op=LOAD Jan 24 00:44:47.679000 audit: BPF prog-id=23 op=UNLOAD Jan 24 00:44:47.679000 audit: BPF prog-id=24 op=UNLOAD Jan 24 00:44:47.687734 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. 
Jan 24 00:44:47.688153 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 24 00:44:47.699781 systemd-tmpfiles[1418]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 24 00:44:47.700096 systemd-tmpfiles[1418]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 24 00:44:47.700377 systemd-tmpfiles[1418]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 24 00:44:47.701471 systemd-tmpfiles[1418]: ACLs are not supported, ignoring. Jan 24 00:44:47.701576 systemd-tmpfiles[1418]: ACLs are not supported, ignoring. Jan 24 00:44:47.705598 systemd[1]: Reload requested from client PID 1417 ('systemctl') (unit ensure-sysext.service)... Jan 24 00:44:47.705890 systemd[1]: Reloading... Jan 24 00:44:47.717323 systemd-tmpfiles[1418]: Detected autofs mount point /boot during canonicalization of boot. Jan 24 00:44:47.719734 systemd-tmpfiles[1418]: Skipping /boot Jan 24 00:44:47.737767 systemd-tmpfiles[1418]: Detected autofs mount point /boot during canonicalization of boot. Jan 24 00:44:47.739754 systemd-tmpfiles[1418]: Skipping /boot Jan 24 00:44:47.741029 systemd-udevd[1419]: Using default interface naming scheme 'v257'. Jan 24 00:44:47.807451 zram_generator::config[1455]: No configuration found. Jan 24 00:44:47.985443 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 24 00:44:47.990444 kernel: mousedev: PS/2 mouse device common for all mice Jan 24 00:44:48.010141 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 24 00:44:48.010352 systemd[1]: Reloading finished in 304 ms. Jan 24 00:44:48.017632 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 24 00:44:48.017000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.019000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.019673 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 24 00:44:48.025000 audit: BPF prog-id=43 op=LOAD Jan 24 00:44:48.025000 audit: BPF prog-id=40 op=UNLOAD Jan 24 00:44:48.025000 audit: BPF prog-id=44 op=LOAD Jan 24 00:44:48.025000 audit: BPF prog-id=45 op=LOAD Jan 24 00:44:48.025000 audit: BPF prog-id=41 op=UNLOAD Jan 24 00:44:48.025000 audit: BPF prog-id=42 op=UNLOAD Jan 24 00:44:48.026000 audit: BPF prog-id=46 op=LOAD Jan 24 00:44:48.028000 audit: BPF prog-id=39 op=UNLOAD Jan 24 00:44:48.028000 audit: BPF prog-id=47 op=LOAD Jan 24 00:44:48.028000 audit: BPF prog-id=36 op=UNLOAD Jan 24 00:44:48.028000 audit: BPF prog-id=48 op=LOAD Jan 24 00:44:48.029000 audit: BPF prog-id=49 op=LOAD Jan 24 00:44:48.029000 audit: BPF prog-id=37 op=UNLOAD Jan 24 00:44:48.029000 audit: BPF prog-id=38 op=UNLOAD Jan 24 00:44:48.029000 audit: BPF prog-id=50 op=LOAD Jan 24 00:44:48.035486 kernel: ACPI: button: Power Button [PWRF] Jan 24 00:44:48.030000 audit: BPF prog-id=30 op=UNLOAD Jan 24 00:44:48.030000 audit: BPF prog-id=51 op=LOAD Jan 24 00:44:48.031000 audit: BPF prog-id=52 op=LOAD Jan 24 00:44:48.031000 audit: BPF prog-id=31 op=UNLOAD Jan 24 00:44:48.031000 audit: BPF prog-id=32 op=UNLOAD Jan 24 00:44:48.032000 audit: BPF prog-id=53 op=LOAD Jan 24 00:44:48.032000 audit: BPF prog-id=54 op=LOAD Jan 24 00:44:48.032000 audit: BPF prog-id=28 op=UNLOAD Jan 24 00:44:48.032000 audit: BPF prog-id=29 op=UNLOAD Jan 24 00:44:48.032000 audit: BPF prog-id=55 op=LOAD Jan 24 00:44:48.032000 audit: BPF prog-id=33 op=UNLOAD Jan 24 00:44:48.034000 audit: BPF prog-id=56 op=LOAD Jan 24 00:44:48.034000 audit: BPF prog-id=57 op=LOAD Jan 24 00:44:48.034000 audit: BPF prog-id=34 op=UNLOAD Jan 24 00:44:48.034000 audit: BPF prog-id=35 op=UNLOAD Jan 24 00:44:48.061194 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jan 24 00:44:48.067295 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 24 00:44:48.070611 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 24 00:44:48.072668 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 24 00:44:48.073708 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 24 00:44:48.076091 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 24 00:44:48.077737 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 24 00:44:48.084491 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 24 00:44:48.084974 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 24 00:44:48.085117 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 24 00:44:48.087278 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 24 00:44:48.088468 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 24 00:44:48.093694 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... 
Jan 24 00:44:48.096000 audit: BPF prog-id=58 op=LOAD Jan 24 00:44:48.097679 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 24 00:44:48.100661 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 24 00:44:48.103176 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 24 00:44:48.105000 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 24 00:44:48.105200 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 24 00:44:48.103525 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 24 00:44:48.108000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.108000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.105855 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 24 00:44:48.107709 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 24 00:44:48.115524 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 24 00:44:48.116275 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 24 00:44:48.118556 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 24 00:44:48.119647 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 24 00:44:48.119789 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 24 00:44:48.119850 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 24 00:44:48.119907 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 24 00:44:48.124405 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 24 00:44:48.125232 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 24 00:44:48.133196 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 24 00:44:48.133691 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 24 00:44:48.133823 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 24 00:44:48.133886 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Jan 24 00:44:48.133969 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 24 00:44:48.150000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.150001 systemd[1]: Finished ensure-sysext.service. Jan 24 00:44:48.158000 audit: BPF prog-id=59 op=LOAD Jan 24 00:44:48.160878 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 24 00:44:48.162044 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 24 00:44:48.163037 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 24 00:44:48.165000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.165000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.165826 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 24 00:44:48.166028 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 24 00:44:48.165000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.165000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.168292 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 24 00:44:48.186985 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 24 00:44:48.187000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.188134 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 24 00:44:48.196000 audit[1554]: SYSTEM_BOOT pid=1554 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.207542 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 24 00:44:48.208000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.215651 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 24 00:44:48.215860 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Jan 24 00:44:48.215000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.215000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.227260 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 24 00:44:48.227525 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 24 00:44:48.227000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.227000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.229446 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 24 00:44:48.248843 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 24 00:44:48.249000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:48.257713 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 24 00:44:48.259720 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 24 00:44:48.272206 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 24 00:44:48.294000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 24 00:44:48.294000 audit[1598]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd5c6f6bc0 a2=420 a3=0 items=0 ppid=1538 pid=1598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:44:48.294000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 24 00:44:48.297291 augenrules[1598]: No rules Jan 24 00:44:48.296321 systemd[1]: audit-rules.service: Deactivated successfully. Jan 24 00:44:48.297603 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 24 00:44:48.308462 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 24 00:44:48.314886 kernel: Console: switching to colour dummy device 80x25 Jan 24 00:44:48.316175 kernel: EDAC MC: Ver: 3.0.0 Jan 24 00:44:48.331188 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 24 00:44:48.331447 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 24 00:44:48.351856 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 24 00:44:48.351928 kernel: [drm] features: -context_init Jan 24 00:44:48.351353 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
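
The PROCTITLE value in the auditctl record above is hex-encoded because the recorded command line contains NUL separators between arguments. Decoding it (the hex string is copied verbatim from the record) recovers the command that loaded the audit rules:

# PROCTITLE is hex-encoded argv, with arguments joined by NUL bytes.
hex_proctitle = (
    "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
)
argv = [part.decode() for part in bytes.fromhex(hex_proctitle).split(b"\x00")]
print(argv)  # ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']
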
Jan 24 00:44:48.351718 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 24 00:44:48.354411 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 24 00:44:48.370878 kernel: [drm] number of scanouts: 1 Jan 24 00:44:48.370932 kernel: [drm] number of cap sets: 0 Jan 24 00:44:48.370964 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 24 00:44:48.380577 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 24 00:44:48.380759 kernel: Console: switching to colour frame buffer device 160x50 Jan 24 00:44:48.389681 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 24 00:44:48.406374 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 24 00:44:48.407568 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 24 00:44:48.414650 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 24 00:44:48.511550 systemd-networkd[1553]: lo: Link UP Jan 24 00:44:48.512012 systemd-networkd[1553]: lo: Gained carrier Jan 24 00:44:48.521855 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 24 00:44:48.523833 systemd[1]: Reached target network.target - Network. Jan 24 00:44:48.525914 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 24 00:44:48.529842 systemd-networkd[1553]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 24 00:44:48.529850 systemd-networkd[1553]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 24 00:44:48.530686 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 24 00:44:48.530868 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 24 00:44:48.530984 systemd[1]: Reached target time-set.target - System Time Set. Jan 24 00:44:48.534196 systemd-networkd[1553]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 24 00:44:48.534203 systemd-networkd[1553]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 24 00:44:48.539969 systemd-networkd[1553]: eth0: Link UP Jan 24 00:44:48.541628 systemd-networkd[1553]: eth0: Gained carrier Jan 24 00:44:48.541646 systemd-networkd[1553]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 24 00:44:48.548758 systemd-networkd[1553]: eth1: Link UP Jan 24 00:44:48.553121 systemd-networkd[1553]: eth1: Gained carrier Jan 24 00:44:48.553142 systemd-networkd[1553]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 24 00:44:48.599525 systemd-networkd[1553]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 24 00:44:48.600968 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 24 00:44:48.603490 systemd-networkd[1553]: eth0: DHCPv4 address 65.109.167.77/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 24 00:44:48.604197 systemd-timesyncd[1568]: Network configuration changed, trying to establish connection. Jan 24 00:44:48.607915 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
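
The DHCPv4 lease messages above also follow a fixed format, so the acquired addresses can be pulled out of a saved copy of the log in the same way as the audit records earlier. A short sketch, again assuming the log text lives in a hypothetical boot.log:

import re

# Matches networkd lease lines such as:
#   eth0: DHCPv4 address 65.109.167.77/32, gateway 172.31.1.1 acquired from 172.31.1.1
LEASE_RE = re.compile(
    r"(?P<iface>\w+): DHCPv4 address (?P<addr>[\d.]+/\d+)"
    r"(?:, gateway (?P<gateway>[\d.]+))? acquired from (?P<server>[\d.]+)"
)

with open("boot.log", encoding="utf-8", errors="replace") as fh:
    for m in LEASE_RE.finditer(fh.read()):
        print(m.group("iface"), m.group("addr"), m.group("gateway"), m.group("server"))
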
Jan 24 00:44:48.678309 ldconfig[1549]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 24 00:44:48.684382 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 24 00:44:48.687096 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 24 00:44:48.707459 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 24 00:44:48.708654 systemd[1]: Reached target sysinit.target - System Initialization. Jan 24 00:44:48.710852 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 24 00:44:48.711051 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 24 00:44:48.711211 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 24 00:44:48.711636 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 24 00:44:48.711876 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 24 00:44:48.712032 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 24 00:44:48.712373 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 24 00:44:48.712694 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 24 00:44:48.712789 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 24 00:44:48.712813 systemd[1]: Reached target paths.target - Path Units. Jan 24 00:44:48.713183 systemd[1]: Reached target timers.target - Timer Units. Jan 24 00:44:48.714056 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 24 00:44:48.715935 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 24 00:44:48.722945 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 24 00:44:48.724576 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 24 00:44:48.724948 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 24 00:44:48.727953 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 24 00:44:48.730812 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 24 00:44:48.734988 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 24 00:44:48.737486 systemd[1]: Reached target sockets.target - Socket Units. Jan 24 00:44:48.738296 systemd[1]: Reached target basic.target - Basic System. Jan 24 00:44:48.739781 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 24 00:44:48.739807 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 24 00:44:48.740895 systemd[1]: Starting containerd.service - containerd container runtime... Jan 24 00:44:48.742608 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 24 00:44:48.748588 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 24 00:44:48.750641 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 24 00:44:48.755664 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... 
Jan 24 00:44:48.763052 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 24 00:44:48.764834 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 24 00:44:48.766869 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 24 00:44:48.771751 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 24 00:44:48.778672 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 24 00:44:48.785834 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jan 24 00:44:48.788711 extend-filesystems[1634]: Found /dev/sda6 Jan 24 00:44:48.798994 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 24 00:44:48.807078 jq[1633]: false Jan 24 00:44:48.809907 extend-filesystems[1634]: Found /dev/sda9 Jan 24 00:44:48.810803 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 24 00:44:48.831572 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 24 00:44:48.833611 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 24 00:44:48.834085 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 24 00:44:48.836454 google_oslogin_nss_cache[1635]: oslogin_cache_refresh[1635]: Refreshing passwd entry cache Jan 24 00:44:48.835206 oslogin_cache_refresh[1635]: Refreshing passwd entry cache Jan 24 00:44:48.836756 systemd[1]: Starting update-engine.service - Update Engine... Jan 24 00:44:48.842536 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 24 00:44:48.847884 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 24 00:44:48.851498 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 24 00:44:48.851750 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 24 00:44:48.852033 systemd[1]: motdgen.service: Deactivated successfully. Jan 24 00:44:48.852219 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 24 00:44:48.853018 coreos-metadata[1630]: Jan 24 00:44:48.852 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jan 24 00:44:48.854019 google_oslogin_nss_cache[1635]: oslogin_cache_refresh[1635]: Failure getting users, quitting Jan 24 00:44:48.854019 google_oslogin_nss_cache[1635]: oslogin_cache_refresh[1635]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 24 00:44:48.854019 google_oslogin_nss_cache[1635]: oslogin_cache_refresh[1635]: Refreshing group entry cache Jan 24 00:44:48.853655 oslogin_cache_refresh[1635]: Failure getting users, quitting Jan 24 00:44:48.853679 oslogin_cache_refresh[1635]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 24 00:44:48.853717 oslogin_cache_refresh[1635]: Refreshing group entry cache Jan 24 00:44:48.854319 extend-filesystems[1634]: Checking size of /dev/sda9 Jan 24 00:44:48.855734 google_oslogin_nss_cache[1635]: oslogin_cache_refresh[1635]: Failure getting groups, quitting Jan 24 00:44:48.855734 google_oslogin_nss_cache[1635]: oslogin_cache_refresh[1635]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Jan 24 00:44:48.854814 oslogin_cache_refresh[1635]: Failure getting groups, quitting Jan 24 00:44:48.854823 oslogin_cache_refresh[1635]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 24 00:44:48.861816 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 24 00:44:48.862070 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 24 00:44:48.862896 jq[1660]: true Jan 24 00:44:48.865641 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 24 00:44:48.865846 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 24 00:44:48.875502 coreos-metadata[1630]: Jan 24 00:44:48.875 INFO Fetch successful Jan 24 00:44:48.875502 coreos-metadata[1630]: Jan 24 00:44:48.875 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jan 24 00:44:48.877403 coreos-metadata[1630]: Jan 24 00:44:48.877 INFO Fetch successful Jan 24 00:44:48.883510 extend-filesystems[1634]: Resized partition /dev/sda9 Jan 24 00:44:48.892682 extend-filesystems[1684]: resize2fs 1.47.3 (8-Jul-2025) Jan 24 00:44:48.899290 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 18410491 blocks Jan 24 00:44:48.360513 systemd-timesyncd[1568]: Contacted time server 85.10.240.253:123 (0.flatcar.pool.ntp.org). Jan 24 00:44:48.394621 systemd-journald[1255]: Time jumped backwards, rotating. Jan 24 00:44:48.399994 tar[1663]: linux-amd64/LICENSE Jan 24 00:44:48.399994 tar[1663]: linux-amd64/helm Jan 24 00:44:48.360533 systemd-resolved[1317]: Clock change detected. Flushing caches. Jan 24 00:44:48.403801 jq[1668]: true Jan 24 00:44:48.360562 systemd-timesyncd[1568]: Initial clock synchronization to Sat 2026-01-24 00:44:48.359904 UTC. Jan 24 00:44:48.428782 dbus-daemon[1631]: [system] SELinux support is enabled Jan 24 00:44:48.429036 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 24 00:44:48.437144 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 24 00:44:48.437172 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 24 00:44:48.439948 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 24 00:44:48.439965 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 24 00:44:48.440809 update_engine[1659]: I20260124 00:44:48.440730 1659 main.cc:92] Flatcar Update Engine starting Jan 24 00:44:48.457858 systemd[1]: Started update-engine.service - Update Engine. Jan 24 00:44:48.462018 update_engine[1659]: I20260124 00:44:48.461236 1659 update_check_scheduler.cc:74] Next update check in 8m49s Jan 24 00:44:48.465139 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 24 00:44:48.473450 systemd-logind[1656]: New seat seat0. Jan 24 00:44:48.482420 systemd-logind[1656]: Watching system buttons on /dev/input/event3 (Power Button) Jan 24 00:44:48.482442 systemd-logind[1656]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 24 00:44:48.482694 systemd[1]: Started systemd-logind.service - User Login Management. 
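The filesystem growth in this span is substantial: the kernel message shows ext4 on sda9 being resized from 1617920 to 18410491 blocks, and the extend-filesystems summary later confirms 4 KiB blocks, so the root filesystem grows from about 6.2 GiB to about 70.2 GiB. The conversion, for reference:

    # Convert the ext4 block counts from the resize messages into sizes (4 KiB blocks).
    BLOCK = 4096
    for blocks in (1617920, 18410491):
        print(blocks, "blocks =", round(blocks * BLOCK / 2**30, 2), "GiB")
    # 1617920 blocks = 6.17 GiB
    # 18410491 blocks = 70.23 GiB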
Jan 24 00:44:48.553151 bash[1712]: Updated "/home/core/.ssh/authorized_keys" Jan 24 00:44:48.554184 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 24 00:44:48.571427 systemd[1]: Starting sshkeys.service... Jan 24 00:44:48.621524 containerd[1682]: time="2026-01-24T00:44:48Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 24 00:44:48.623111 containerd[1682]: time="2026-01-24T00:44:48.622927910Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 24 00:44:48.628596 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 24 00:44:48.635727 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637245972Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.48µs" Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637276392Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637312162Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637321892Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637451383Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637464273Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637516403Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637527153Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637721163Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637735933Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637746173Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639097 containerd[1682]: time="2026-01-24T00:44:48.637752513Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639288 containerd[1682]: time="2026-01-24T00:44:48.637887783Z" level=info msg="skip loading plugin" 
error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639288 containerd[1682]: time="2026-01-24T00:44:48.637898313Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639288 containerd[1682]: time="2026-01-24T00:44:48.637968863Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 24 00:44:48.639369 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 24 00:44:48.640707 containerd[1682]: time="2026-01-24T00:44:48.640690115Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 24 00:44:48.640811 containerd[1682]: time="2026-01-24T00:44:48.640798875Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 24 00:44:48.641342 containerd[1682]: time="2026-01-24T00:44:48.641328096Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 24 00:44:48.641685 containerd[1682]: time="2026-01-24T00:44:48.641671716Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 24 00:44:48.642448 containerd[1682]: time="2026-01-24T00:44:48.642432397Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 24 00:44:48.642584 containerd[1682]: time="2026-01-24T00:44:48.642572497Z" level=info msg="metadata content store policy set" policy=shared Jan 24 00:44:48.643286 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663730674Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663788555Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663855555Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663866905Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663877275Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663886935Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663896555Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663904905Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663917395Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663926785Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663935415Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663943315Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663951195Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 24 00:44:48.665382 containerd[1682]: time="2026-01-24T00:44:48.663960215Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664075455Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664105355Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664117005Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664125145Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664133005Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 24 00:44:48.665620 containerd[1682]: 
time="2026-01-24T00:44:48.664140945Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664150005Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664159295Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664167275Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664175765Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664197795Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664224635Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664255025Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664266235Z" level=info msg="Start snapshots syncer" Jan 24 00:44:48.665620 containerd[1682]: time="2026-01-24T00:44:48.664281905Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 24 00:44:48.665790 containerd[1682]: time="2026-01-24T00:44:48.664458585Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 24 
00:44:48.665790 containerd[1682]: time="2026-01-24T00:44:48.664494945Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664529225Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664608505Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664621605Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664629235Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664636235Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664645875Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664653835Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664661205Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664668325Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664676505Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664695795Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664704885Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 24 00:44:48.665880 containerd[1682]: time="2026-01-24T00:44:48.664725275Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 24 00:44:48.666024 containerd[1682]: time="2026-01-24T00:44:48.664732845Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 24 00:44:48.666024 containerd[1682]: time="2026-01-24T00:44:48.664738635Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 24 00:44:48.666024 containerd[1682]: time="2026-01-24T00:44:48.664748035Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 24 00:44:48.666024 containerd[1682]: time="2026-01-24T00:44:48.664755565Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 24 00:44:48.666024 containerd[1682]: time="2026-01-24T00:44:48.664764765Z" level=info msg="runtime interface created" Jan 24 00:44:48.666024 containerd[1682]: 
time="2026-01-24T00:44:48.664768715Z" level=info msg="created NRI interface" Jan 24 00:44:48.666024 containerd[1682]: time="2026-01-24T00:44:48.664774775Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 24 00:44:48.666024 containerd[1682]: time="2026-01-24T00:44:48.664782575Z" level=info msg="Connect containerd service" Jan 24 00:44:48.666024 containerd[1682]: time="2026-01-24T00:44:48.664798495Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 24 00:44:48.668542 containerd[1682]: time="2026-01-24T00:44:48.668521688Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 24 00:44:48.673078 kernel: EXT4-fs (sda9): resized filesystem to 18410491 Jan 24 00:44:48.711096 extend-filesystems[1684]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 24 00:44:48.711096 extend-filesystems[1684]: old_desc_blocks = 1, new_desc_blocks = 9 Jan 24 00:44:48.711096 extend-filesystems[1684]: The filesystem on /dev/sda9 is now 18410491 (4k) blocks long. Jan 24 00:44:48.718753 extend-filesystems[1634]: Resized filesystem in /dev/sda9 Jan 24 00:44:48.726524 coreos-metadata[1725]: Jan 24 00:44:48.723 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jan 24 00:44:48.721007 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 24 00:44:48.721895 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 24 00:44:48.729261 coreos-metadata[1725]: Jan 24 00:44:48.728 INFO Fetch successful Jan 24 00:44:48.730803 unknown[1725]: wrote ssh authorized keys file for user: core Jan 24 00:44:48.777233 update-ssh-keys[1737]: Updated "/home/core/.ssh/authorized_keys" Jan 24 00:44:48.778562 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 24 00:44:48.786550 systemd[1]: Finished sshkeys.service. Jan 24 00:44:48.808760 containerd[1682]: time="2026-01-24T00:44:48.808724745Z" level=info msg="Start subscribing containerd event" Jan 24 00:44:48.811103 containerd[1682]: time="2026-01-24T00:44:48.808767435Z" level=info msg="Start recovering state" Jan 24 00:44:48.811314 containerd[1682]: time="2026-01-24T00:44:48.811296237Z" level=info msg="Start event monitor" Jan 24 00:44:48.811459 containerd[1682]: time="2026-01-24T00:44:48.811428548Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 24 00:44:48.811524 containerd[1682]: time="2026-01-24T00:44:48.811508788Z" level=info msg="Start cni network conf syncer for default" Jan 24 00:44:48.811540 containerd[1682]: time="2026-01-24T00:44:48.811522978Z" level=info msg="Start streaming server" Jan 24 00:44:48.811540 containerd[1682]: time="2026-01-24T00:44:48.811533168Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 24 00:44:48.811571 containerd[1682]: time="2026-01-24T00:44:48.811541018Z" level=info msg="runtime interface starting up..." Jan 24 00:44:48.811571 containerd[1682]: time="2026-01-24T00:44:48.811547288Z" level=info msg="starting plugins..." Jan 24 00:44:48.811571 containerd[1682]: time="2026-01-24T00:44:48.811563528Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 24 00:44:48.811609 containerd[1682]: time="2026-01-24T00:44:48.811487158Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Jan 24 00:44:48.812170 systemd[1]: Started containerd.service - containerd container runtime. Jan 24 00:44:48.814271 containerd[1682]: time="2026-01-24T00:44:48.814247420Z" level=info msg="containerd successfully booted in 0.194969s" Jan 24 00:44:48.828396 locksmithd[1702]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 24 00:44:48.907013 sshd_keygen[1658]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 24 00:44:48.925496 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 24 00:44:48.928080 tar[1663]: linux-amd64/README.md Jan 24 00:44:48.930451 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 24 00:44:48.944529 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 24 00:44:48.947643 systemd[1]: issuegen.service: Deactivated successfully. Jan 24 00:44:48.947896 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 24 00:44:48.951622 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 24 00:44:48.964799 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 24 00:44:48.967987 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 24 00:44:48.971574 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 24 00:44:48.972461 systemd[1]: Reached target getty.target - Login Prompts. Jan 24 00:44:49.211494 systemd-networkd[1553]: eth1: Gained IPv6LL Jan 24 00:44:49.216388 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 24 00:44:49.218529 systemd[1]: Reached target network-online.target - Network is Online. Jan 24 00:44:49.225875 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 24 00:44:49.231183 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 24 00:44:49.281844 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 24 00:44:49.851351 systemd-networkd[1553]: eth0: Gained IPv6LL Jan 24 00:44:50.619245 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 24 00:44:50.622000 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 24 00:44:50.625766 systemd[1]: Startup finished in 3.900s (kernel) + 5.984s (initrd) + 5.931s (userspace) = 15.816s. Jan 24 00:44:50.630880 (kubelet)[1787]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 24 00:44:51.472994 kubelet[1787]: E0124 00:44:51.472891 1787 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 24 00:44:51.478738 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 24 00:44:51.479113 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 24 00:44:51.479880 systemd[1]: kubelet.service: Consumed 1.742s CPU time, 268.9M memory peak. Jan 24 00:44:51.918731 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 24 00:44:51.921732 systemd[1]: Started sshd@0-65.109.167.77:22-4.153.228.146:37322.service - OpenSSH per-connection server daemon (4.153.228.146:37322). 
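The kubelet failure at the end of this span is the expected state of a node that has not joined a cluster yet: the service starts, finds no /var/lib/kubelet/config.yaml (a file normally written by kubeadm init or kubeadm join), exits with status 1, and is rescheduled for a restart a little later in the log. A trivial reproduction of the failing check, with the path copied from the error message and the kubeadm detail stated as an assumption:

    # Reproduce the condition kubelet reports: the config file simply does not exist yet.
    from pathlib import Path
    cfg = Path("/var/lib/kubelet/config.yaml")
    print(cfg.is_file())  # False until cluster bootstrap (e.g. kubeadm) writes the file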
Jan 24 00:44:52.658799 sshd[1799]: Accepted publickey for core from 4.153.228.146 port 37322 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:44:52.662493 sshd-session[1799]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:44:52.674420 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 24 00:44:52.676820 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 24 00:44:52.687222 systemd-logind[1656]: New session 1 of user core. Jan 24 00:44:52.713257 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 24 00:44:52.722017 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 24 00:44:52.751220 (systemd)[1805]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:44:52.756759 systemd-logind[1656]: New session 2 of user core. Jan 24 00:44:52.913763 systemd[1805]: Queued start job for default target default.target. Jan 24 00:44:52.924373 systemd[1805]: Created slice app.slice - User Application Slice. Jan 24 00:44:52.924396 systemd[1805]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 24 00:44:52.924408 systemd[1805]: Reached target paths.target - Paths. Jan 24 00:44:52.924450 systemd[1805]: Reached target timers.target - Timers. Jan 24 00:44:52.925883 systemd[1805]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 24 00:44:52.929188 systemd[1805]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 24 00:44:52.937227 systemd[1805]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 24 00:44:52.946296 systemd[1805]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 24 00:44:52.946392 systemd[1805]: Reached target sockets.target - Sockets. Jan 24 00:44:52.946538 systemd[1805]: Reached target basic.target - Basic System. Jan 24 00:44:52.946725 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 24 00:44:52.947028 systemd[1805]: Reached target default.target - Main User Target. Jan 24 00:44:52.947077 systemd[1805]: Startup finished in 180ms. Jan 24 00:44:52.954384 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 24 00:44:53.342318 systemd[1]: Started sshd@1-65.109.167.77:22-4.153.228.146:37334.service - OpenSSH per-connection server daemon (4.153.228.146:37334). Jan 24 00:44:54.037230 sshd[1819]: Accepted publickey for core from 4.153.228.146 port 37334 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:44:54.040737 sshd-session[1819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:44:54.052260 systemd-logind[1656]: New session 3 of user core. Jan 24 00:44:54.058359 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 24 00:44:54.416360 sshd[1823]: Connection closed by 4.153.228.146 port 37334 Jan 24 00:44:54.417441 sshd-session[1819]: pam_unix(sshd:session): session closed for user core Jan 24 00:44:54.425134 systemd[1]: sshd@1-65.109.167.77:22-4.153.228.146:37334.service: Deactivated successfully. Jan 24 00:44:54.429290 systemd[1]: session-3.scope: Deactivated successfully. Jan 24 00:44:54.431570 systemd-logind[1656]: Session 3 logged out. Waiting for processes to exit. Jan 24 00:44:54.435289 systemd-logind[1656]: Removed session 3. 
Jan 24 00:44:54.563519 systemd[1]: Started sshd@2-65.109.167.77:22-4.153.228.146:37336.service - OpenSSH per-connection server daemon (4.153.228.146:37336). Jan 24 00:44:55.254164 sshd[1829]: Accepted publickey for core from 4.153.228.146 port 37336 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:44:55.256993 sshd-session[1829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:44:55.266486 systemd-logind[1656]: New session 4 of user core. Jan 24 00:44:55.278393 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 24 00:44:55.622623 sshd[1833]: Connection closed by 4.153.228.146 port 37336 Jan 24 00:44:55.624475 sshd-session[1829]: pam_unix(sshd:session): session closed for user core Jan 24 00:44:55.632195 systemd[1]: sshd@2-65.109.167.77:22-4.153.228.146:37336.service: Deactivated successfully. Jan 24 00:44:55.635753 systemd[1]: session-4.scope: Deactivated successfully. Jan 24 00:44:55.638659 systemd-logind[1656]: Session 4 logged out. Waiting for processes to exit. Jan 24 00:44:55.641241 systemd-logind[1656]: Removed session 4. Jan 24 00:44:55.770401 systemd[1]: Started sshd@3-65.109.167.77:22-4.153.228.146:50532.service - OpenSSH per-connection server daemon (4.153.228.146:50532). Jan 24 00:44:56.432124 sshd[1839]: Accepted publickey for core from 4.153.228.146 port 50532 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:44:56.433716 sshd-session[1839]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:44:56.439238 systemd-logind[1656]: New session 5 of user core. Jan 24 00:44:56.445226 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 24 00:44:56.802141 sshd[1843]: Connection closed by 4.153.228.146 port 50532 Jan 24 00:44:56.804226 sshd-session[1839]: pam_unix(sshd:session): session closed for user core Jan 24 00:44:56.808034 systemd[1]: sshd@3-65.109.167.77:22-4.153.228.146:50532.service: Deactivated successfully. Jan 24 00:44:56.810199 systemd[1]: session-5.scope: Deactivated successfully. Jan 24 00:44:56.811516 systemd-logind[1656]: Session 5 logged out. Waiting for processes to exit. Jan 24 00:44:56.813000 systemd-logind[1656]: Removed session 5. Jan 24 00:44:56.937976 systemd[1]: Started sshd@4-65.109.167.77:22-4.153.228.146:50534.service - OpenSSH per-connection server daemon (4.153.228.146:50534). Jan 24 00:44:57.592134 sshd[1849]: Accepted publickey for core from 4.153.228.146 port 50534 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:44:57.593966 sshd-session[1849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:44:57.603550 systemd-logind[1656]: New session 6 of user core. Jan 24 00:44:57.606376 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 24 00:44:57.861853 sudo[1854]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 24 00:44:57.862619 sudo[1854]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 24 00:44:57.879244 sudo[1854]: pam_unix(sudo:session): session closed for user root Jan 24 00:44:58.001102 sshd[1853]: Connection closed by 4.153.228.146 port 50534 Jan 24 00:44:58.002229 sshd-session[1849]: pam_unix(sshd:session): session closed for user core Jan 24 00:44:58.010379 systemd[1]: sshd@4-65.109.167.77:22-4.153.228.146:50534.service: Deactivated successfully. Jan 24 00:44:58.013758 systemd[1]: session-6.scope: Deactivated successfully. 
Jan 24 00:44:58.015367 systemd-logind[1656]: Session 6 logged out. Waiting for processes to exit. Jan 24 00:44:58.018493 systemd-logind[1656]: Removed session 6. Jan 24 00:44:58.137687 systemd[1]: Started sshd@5-65.109.167.77:22-4.153.228.146:50548.service - OpenSSH per-connection server daemon (4.153.228.146:50548). Jan 24 00:44:58.841132 sshd[1861]: Accepted publickey for core from 4.153.228.146 port 50548 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:44:58.843640 sshd-session[1861]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:44:58.854187 systemd-logind[1656]: New session 7 of user core. Jan 24 00:44:58.861394 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 24 00:44:59.099021 sudo[1867]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 24 00:44:59.099777 sudo[1867]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 24 00:44:59.103561 sudo[1867]: pam_unix(sudo:session): session closed for user root Jan 24 00:44:59.116589 sudo[1866]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 24 00:44:59.117416 sudo[1866]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 24 00:44:59.129428 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 24 00:44:59.168000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 24 00:44:59.171196 augenrules[1891]: No rules Jan 24 00:44:59.172100 kernel: kauditd_printk_skb: 135 callbacks suppressed Jan 24 00:44:59.172132 kernel: audit: type=1305 audit(1769215499.168:233): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 24 00:44:59.172648 systemd[1]: audit-rules.service: Deactivated successfully. Jan 24 00:44:59.173009 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 24 00:44:59.177306 sudo[1866]: pam_unix(sudo:session): session closed for user root Jan 24 00:44:59.168000 audit[1891]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffccf554ea0 a2=420 a3=0 items=0 ppid=1872 pid=1891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:44:59.189503 kernel: audit: type=1300 audit(1769215499.168:233): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffccf554ea0 a2=420 a3=0 items=0 ppid=1872 pid=1891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:44:59.168000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 24 00:44:59.190129 kernel: audit: type=1327 audit(1769215499.168:233): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 24 00:44:59.172000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:44:59.197257 kernel: audit: type=1130 audit(1769215499.172:234): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.197335 kernel: audit: type=1131 audit(1769215499.172:235): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.172000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.176000 audit[1866]: USER_END pid=1866 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.206730 kernel: audit: type=1106 audit(1769215499.176:236): pid=1866 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.206821 kernel: audit: type=1104 audit(1769215499.176:237): pid=1866 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.176000 audit[1866]: CRED_DISP pid=1866 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.300650 sshd[1865]: Connection closed by 4.153.228.146 port 50548 Jan 24 00:44:59.301272 sshd-session[1861]: pam_unix(sshd:session): session closed for user core Jan 24 00:44:59.302000 audit[1861]: USER_END pid=1861 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:44:59.307717 systemd[1]: sshd@5-65.109.167.77:22-4.153.228.146:50548.service: Deactivated successfully. Jan 24 00:44:59.309872 systemd[1]: session-7.scope: Deactivated successfully. Jan 24 00:44:59.312638 systemd-logind[1656]: Session 7 logged out. Waiting for processes to exit. Jan 24 00:44:59.313639 systemd-logind[1656]: Removed session 7. 
Jan 24 00:44:59.302000 audit[1861]: CRED_DISP pid=1861 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:44:59.317162 kernel: audit: type=1106 audit(1769215499.302:238): pid=1861 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:44:59.317188 kernel: audit: type=1104 audit(1769215499.302:239): pid=1861 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:44:59.307000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-65.109.167.77:22-4.153.228.146:50548 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.337307 kernel: audit: type=1131 audit(1769215499.307:240): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-65.109.167.77:22-4.153.228.146:50548 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.433000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-65.109.167.77:22-4.153.228.146:50562 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:44:59.434339 systemd[1]: Started sshd@6-65.109.167.77:22-4.153.228.146:50562.service - OpenSSH per-connection server daemon (4.153.228.146:50562). Jan 24 00:45:00.116000 audit[1900]: USER_ACCT pid=1900 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:45:00.117857 sshd[1900]: Accepted publickey for core from 4.153.228.146 port 50562 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:45:00.117000 audit[1900]: CRED_ACQ pid=1900 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:45:00.117000 audit[1900]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd56ff2d80 a2=3 a3=0 items=0 ppid=1 pid=1900 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:00.117000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:45:00.119592 sshd-session[1900]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:45:00.126979 systemd-logind[1656]: New session 8 of user core. Jan 24 00:45:00.133247 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 24 00:45:00.135000 audit[1900]: USER_START pid=1900 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:45:00.138000 audit[1904]: CRED_ACQ pid=1904 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:45:00.369000 audit[1905]: USER_ACCT pid=1905 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:45:00.370783 sudo[1905]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 24 00:45:00.369000 audit[1905]: CRED_REFR pid=1905 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:45:00.370000 audit[1905]: USER_START pid=1905 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:45:00.371290 sudo[1905]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 24 00:45:00.923930 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 24 00:45:00.936386 (dockerd)[1923]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 24 00:45:01.366788 dockerd[1923]: time="2026-01-24T00:45:01.366119116Z" level=info msg="Starting up" Jan 24 00:45:01.369129 dockerd[1923]: time="2026-01-24T00:45:01.369099499Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 24 00:45:01.390873 dockerd[1923]: time="2026-01-24T00:45:01.390809467Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 24 00:45:01.445491 dockerd[1923]: time="2026-01-24T00:45:01.445457293Z" level=info msg="Loading containers: start." 
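The NETFILTER_CFG records that follow come in two waves: family=2 entries while dockerd creates its IPv4 chains (DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2) through iptables, then family=10 entries as it repeats the same chain layout for IPv6 through ip6tables. The family numbers match the usual address-family constants, as a one-line illustration confirms:

    # family=2 / family=10 in the audit records correspond to AF_INET and AF_INET6.
    import socket
    print(int(socket.AF_INET), int(socket.AF_INET6))  # 2 10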
Jan 24 00:45:01.458099 kernel: Initializing XFRM netlink socket Jan 24 00:45:01.548000 audit[1972]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1972 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.548000 audit[1972]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffed9f70b60 a2=0 a3=0 items=0 ppid=1923 pid=1972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.548000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 24 00:45:01.553000 audit[1974]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1974 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.553000 audit[1974]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe81f0a7e0 a2=0 a3=0 items=0 ppid=1923 pid=1974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.553000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 24 00:45:01.557000 audit[1976]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1976 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.557000 audit[1976]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdc84f1d60 a2=0 a3=0 items=0 ppid=1923 pid=1976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.557000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 24 00:45:01.562000 audit[1978]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1978 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.562000 audit[1978]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd4a40ea00 a2=0 a3=0 items=0 ppid=1923 pid=1978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.562000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 24 00:45:01.566000 audit[1980]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=1980 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.566000 audit[1980]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe0a8629e0 a2=0 a3=0 items=0 ppid=1923 pid=1980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.566000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 24 00:45:01.571000 audit[1982]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=1982 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.571000 audit[1982]: SYSCALL arch=c000003e syscall=46 
success=yes exit=112 a0=3 a1=7ffe7ef3dfa0 a2=0 a3=0 items=0 ppid=1923 pid=1982 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.571000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 24 00:45:01.575000 audit[1984]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1984 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.575000 audit[1984]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd19fa9180 a2=0 a3=0 items=0 ppid=1923 pid=1984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.575000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 24 00:45:01.580000 audit[1986]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=1986 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.580000 audit[1986]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffcd14798e0 a2=0 a3=0 items=0 ppid=1923 pid=1986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.580000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 24 00:45:01.625000 audit[1989]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=1989 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.625000 audit[1989]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffd5d8b87e0 a2=0 a3=0 items=0 ppid=1923 pid=1989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.625000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 24 00:45:01.630000 audit[1991]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=1991 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.630000 audit[1991]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fffe39c6a50 a2=0 a3=0 items=0 ppid=1923 pid=1991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.630000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 24 00:45:01.635000 audit[1993]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1993 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.635000 audit[1993]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7fff91393170 a2=0 
a3=0 items=0 ppid=1923 pid=1993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.635000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 24 00:45:01.640000 audit[1995]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=1995 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.640000 audit[1995]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffca6350a40 a2=0 a3=0 items=0 ppid=1923 pid=1995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.640000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 24 00:45:01.646000 audit[1997]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=1997 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.646000 audit[1997]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7fffacc89de0 a2=0 a3=0 items=0 ppid=1923 pid=1997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.646000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 24 00:45:01.723000 audit[2027]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.723000 audit[2027]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffd0e8d3f00 a2=0 a3=0 items=0 ppid=1923 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.723000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 24 00:45:01.729616 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 24 00:45:01.729000 audit[2029]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.729000 audit[2029]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffffff38040 a2=0 a3=0 items=0 ppid=1923 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.729000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 24 00:45:01.733425 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
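The audit PROCTITLE values in these records are the invoking command line, hex-encoded with NUL bytes separating the arguments; the one logged at 00:45:01.723 above, for example, decodes to /usr/bin/ip6tables --wait -t nat -N DOCKER. A minimal Python sketch of that decoding, using the hex string copied verbatim from the record:

    # Decode an audit PROCTITLE value: it is the process command line,
    # hex-encoded, with NUL bytes separating the individual arguments.
    proctitle = ("2F7573722F62696E2F6970367461626C6573002D2D77616974"
                 "002D74006E6174002D4E00444F434B4552")
    argv = [a.decode() for a in bytes.fromhex(proctitle).split(b"\x00")]
    print(argv)   # ['/usr/bin/ip6tables', '--wait', '-t', 'nat', '-N', 'DOCKER']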
Jan 24 00:45:01.735000 audit[2032]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.735000 audit[2032]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd350b22d0 a2=0 a3=0 items=0 ppid=1923 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.735000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 24 00:45:01.740000 audit[2034]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.740000 audit[2034]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc02d492a0 a2=0 a3=0 items=0 ppid=1923 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.740000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 24 00:45:01.747000 audit[2038]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2038 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.747000 audit[2038]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffe4148310 a2=0 a3=0 items=0 ppid=1923 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.747000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 24 00:45:01.751000 audit[2040]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2040 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.751000 audit[2040]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffeac2e9730 a2=0 a3=0 items=0 ppid=1923 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.751000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 24 00:45:01.755000 audit[2042]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2042 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.755000 audit[2042]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe764d1780 a2=0 a3=0 items=0 ppid=1923 pid=2042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.755000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 24 00:45:01.762000 audit[2044]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2044 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 
00:45:01.762000 audit[2044]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffc3bad1b00 a2=0 a3=0 items=0 ppid=1923 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.762000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 24 00:45:01.768000 audit[2046]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.768000 audit[2046]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffd8d317660 a2=0 a3=0 items=0 ppid=1923 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.768000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 24 00:45:01.774000 audit[2048]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2048 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.774000 audit[2048]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe44ae7710 a2=0 a3=0 items=0 ppid=1923 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.774000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 24 00:45:01.778000 audit[2050]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2050 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.778000 audit[2050]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7fff41abd5f0 a2=0 a3=0 items=0 ppid=1923 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.778000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 24 00:45:01.782000 audit[2052]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.782000 audit[2052]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fffd6549f30 a2=0 a3=0 items=0 ppid=1923 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.782000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 24 00:45:01.787000 audit[2054]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2054 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.787000 
audit[2054]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffdd73778f0 a2=0 a3=0 items=0 ppid=1923 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.787000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 24 00:45:01.798000 audit[2059]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2059 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.798000 audit[2059]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffb5b05740 a2=0 a3=0 items=0 ppid=1923 pid=2059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.798000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 24 00:45:01.801000 audit[2061]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2061 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.801000 audit[2061]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fff4b629820 a2=0 a3=0 items=0 ppid=1923 pid=2061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.801000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 24 00:45:01.803000 audit[2063]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.803000 audit[2063]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffc06cc6b30 a2=0 a3=0 items=0 ppid=1923 pid=2063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.803000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 24 00:45:01.805000 audit[2065]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2065 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.805000 audit[2065]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffebcf2b970 a2=0 a3=0 items=0 ppid=1923 pid=2065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.805000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 24 00:45:01.807000 audit[2067]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2067 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.807000 audit[2067]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffc8e6030d0 a2=0 a3=0 items=0 ppid=1923 pid=2067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.807000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 24 00:45:01.809000 audit[2069]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2069 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:01.809000 audit[2069]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffdfa90edc0 a2=0 a3=0 items=0 ppid=1923 pid=2069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.809000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 24 00:45:01.844000 audit[2073]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2073 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.844000 audit[2073]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffdf75136e0 a2=0 a3=0 items=0 ppid=1923 pid=2073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.844000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 24 00:45:01.847000 audit[2075]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.847000 audit[2075]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffc247e9270 a2=0 a3=0 items=0 ppid=1923 pid=2075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.847000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 24 00:45:01.859000 audit[2083]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2083 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.859000 audit[2083]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffc1d827e30 a2=0 a3=0 items=0 ppid=1923 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.859000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 24 00:45:01.872000 audit[2089]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2089 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.872000 audit[2089]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffcdc207760 a2=0 a3=0 items=0 ppid=1923 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.872000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 24 00:45:01.875000 audit[2091]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2091 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.875000 audit[2091]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7fffb6b6c400 a2=0 a3=0 items=0 ppid=1923 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.875000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 24 00:45:01.878000 audit[2093]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2093 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.878000 audit[2093]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffeb096be60 a2=0 a3=0 items=0 ppid=1923 pid=2093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.878000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 24 00:45:01.883000 audit[2097]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2097 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.883000 audit[2097]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffca3dfef20 a2=0 a3=0 items=0 ppid=1923 pid=2097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.883000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 24 00:45:01.885000 audit[2099]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2099 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:01.885000 audit[2099]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd43535420 a2=0 a3=0 items=0 ppid=1923 pid=2099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:01.885000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 24 00:45:01.887146 systemd-networkd[1553]: docker0: Link UP Jan 24 00:45:01.890988 dockerd[1923]: time="2026-01-24T00:45:01.890948024Z" level=info msg="Loading containers: done." Jan 24 00:45:01.905654 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
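Each NETFILTER_CFG record above names the table being touched, the nftables operation (nft_register_chain or nft_register_rule) and an address family: family=2 is AF_INET, i.e. the iptables invocations, and family=10 is AF_INET6, the ip6tables ones. A short sketch, assuming the audit records are available as plain text, that splits such a line into its key=value fields:

    import re

    AF = {2: "IPv4", 10: "IPv6"}   # AF_INET and AF_INET6

    def parse_audit(line: str) -> dict:
        # key=value tokens; values may be quoted, e.g. comm="iptables"
        fields = dict(re.findall(r'(\w+)=("[^"]*"|\S+)', line))
        return {k: v.strip('"') for k, v in fields.items()}

    rec = parse_audit('NETFILTER_CFG table=nat:34 family=2 entries=2 '
                      'op=nft_register_chain pid=2073 comm="iptables"')
    print(rec["table"], AF[int(rec["family"])], rec["op"])
    # nat:34 IPv4 nft_register_chain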
Jan 24 00:45:01.905000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:01.910290 dockerd[1923]: time="2026-01-24T00:45:01.910245340Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 24 00:45:01.910401 dockerd[1923]: time="2026-01-24T00:45:01.910342650Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 24 00:45:01.910450 dockerd[1923]: time="2026-01-24T00:45:01.910431460Z" level=info msg="Initializing buildkit" Jan 24 00:45:01.913695 (kubelet)[2108]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 24 00:45:01.933228 dockerd[1923]: time="2026-01-24T00:45:01.933034979Z" level=info msg="Completed buildkit initialization" Jan 24 00:45:01.940082 dockerd[1923]: time="2026-01-24T00:45:01.939485014Z" level=info msg="Daemon has completed initialization" Jan 24 00:45:01.940082 dockerd[1923]: time="2026-01-24T00:45:01.939881964Z" level=info msg="API listen on /run/docker.sock" Jan 24 00:45:01.941101 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 24 00:45:01.940000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:01.959455 kubelet[2108]: E0124 00:45:01.959400 2108 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 24 00:45:01.963506 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 24 00:45:01.963660 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 24 00:45:01.963000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 24 00:45:01.964591 systemd[1]: kubelet.service: Consumed 170ms CPU time, 110.3M memory peak. Jan 24 00:45:02.936934 containerd[1682]: time="2026-01-24T00:45:02.936861995Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Jan 24 00:45:03.569009 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1463581914.mount: Deactivated successfully. 
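The kubelet start above fails because /var/lib/kubelet/config.yaml does not exist yet; that file is normally written by kubeadm during init or join, so systemd keeps rescheduling the unit (the restart counter is at 1 here and reaches 2 further down) until the file appears. A minimal sketch of the same pre-flight check, using the path reported in the error:

    from pathlib import Path

    CONFIG = Path("/var/lib/kubelet/config.yaml")   # path taken from the kubelet error above

    if not CONFIG.is_file():
        # mirrors the failure mode in the log: the unit exits with status 1
        # and systemd schedules another restart until the file is written
        raise SystemExit(f"kubelet config not found: {CONFIG}")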
Jan 24 00:45:05.333783 containerd[1682]: time="2026-01-24T00:45:05.333727402Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:05.334932 containerd[1682]: time="2026-01-24T00:45:05.334800493Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=29277594" Jan 24 00:45:05.335801 containerd[1682]: time="2026-01-24T00:45:05.335778133Z" level=info msg="ImageCreate event name:\"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:05.337879 containerd[1682]: time="2026-01-24T00:45:05.337854445Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:05.338454 containerd[1682]: time="2026-01-24T00:45:05.338436496Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"30111311\" in 2.401526821s" Jan 24 00:45:05.338512 containerd[1682]: time="2026-01-24T00:45:05.338502446Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\"" Jan 24 00:45:05.339050 containerd[1682]: time="2026-01-24T00:45:05.339023776Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Jan 24 00:45:07.616318 containerd[1682]: time="2026-01-24T00:45:07.616259803Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:07.617515 containerd[1682]: time="2026-01-24T00:45:07.617344464Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=26008626" Jan 24 00:45:07.618412 containerd[1682]: time="2026-01-24T00:45:07.618390865Z" level=info msg="ImageCreate event name:\"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:07.620645 containerd[1682]: time="2026-01-24T00:45:07.620598567Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:07.621277 containerd[1682]: time="2026-01-24T00:45:07.621260137Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"27673815\" in 2.282212331s" Jan 24 00:45:07.621329 containerd[1682]: time="2026-01-24T00:45:07.621319507Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\"" Jan 24 00:45:07.621954 
containerd[1682]: time="2026-01-24T00:45:07.621808858Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Jan 24 00:45:08.944234 containerd[1682]: time="2026-01-24T00:45:08.944179719Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:08.945341 containerd[1682]: time="2026-01-24T00:45:08.945203830Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=20149965" Jan 24 00:45:08.946160 containerd[1682]: time="2026-01-24T00:45:08.946137901Z" level=info msg="ImageCreate event name:\"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:08.948353 containerd[1682]: time="2026-01-24T00:45:08.948326773Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:08.948992 containerd[1682]: time="2026-01-24T00:45:08.948967353Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"21815154\" in 1.327139085s" Jan 24 00:45:08.949070 containerd[1682]: time="2026-01-24T00:45:08.949047854Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\"" Jan 24 00:45:08.949498 containerd[1682]: time="2026-01-24T00:45:08.949485954Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Jan 24 00:45:10.071866 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2886639635.mount: Deactivated successfully. 
Jan 24 00:45:10.353087 containerd[1682]: time="2026-01-24T00:45:10.352975603Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:10.354118 containerd[1682]: time="2026-01-24T00:45:10.354097934Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=31926374" Jan 24 00:45:10.355055 containerd[1682]: time="2026-01-24T00:45:10.355013065Z" level=info msg="ImageCreate event name:\"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:10.356688 containerd[1682]: time="2026-01-24T00:45:10.356657396Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:10.357084 containerd[1682]: time="2026-01-24T00:45:10.356990276Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"31929115\" in 1.407440312s" Jan 24 00:45:10.357084 containerd[1682]: time="2026-01-24T00:45:10.357015476Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\"" Jan 24 00:45:10.357613 containerd[1682]: time="2026-01-24T00:45:10.357585067Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 24 00:45:10.879372 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4172786090.mount: Deactivated successfully. 
Jan 24 00:45:11.716001 containerd[1682]: time="2026-01-24T00:45:11.715947489Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:11.717175 containerd[1682]: time="2026-01-24T00:45:11.717082759Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20128467" Jan 24 00:45:11.718024 containerd[1682]: time="2026-01-24T00:45:11.717979560Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:11.720527 containerd[1682]: time="2026-01-24T00:45:11.720491212Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:11.721418 containerd[1682]: time="2026-01-24T00:45:11.721303173Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.363695076s" Jan 24 00:45:11.721547 containerd[1682]: time="2026-01-24T00:45:11.721474893Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Jan 24 00:45:11.722031 containerd[1682]: time="2026-01-24T00:45:11.721979894Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 24 00:45:12.161568 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 24 00:45:12.167017 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 24 00:45:12.181896 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4280872479.mount: Deactivated successfully. 
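Each completed pull is summarised in a single containerd "Pulled image" message carrying the tag, image id, repo digest, size and elapsed time. A small sketch, assuming the messages are available as plain text, that extracts those fields:

    import re

    # Field extraction for containerd "Pulled image" messages like the ones above
    # (the sample below is shortened; the real message also carries repo tag and digest).
    PULLED = re.compile(r'Pulled image "(?P<image>[^"]+)".*?'
                        r'size "(?P<size>\d+)" in (?P<elapsed>[\d.]+m?s)')

    msg = ('Pulled image "registry.k8s.io/pause:3.10" with image id '
           '"sha256:873ed751..." size "320368" in 488.514347ms')
    m = PULLED.search(msg)
    print(m["image"], int(m["size"]), m["elapsed"])
    # registry.k8s.io/pause:3.10 320368 488.514347ms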
Jan 24 00:45:12.197326 containerd[1682]: time="2026-01-24T00:45:12.197259279Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 24 00:45:12.200666 containerd[1682]: time="2026-01-24T00:45:12.200265052Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 24 00:45:12.202463 containerd[1682]: time="2026-01-24T00:45:12.202412084Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 24 00:45:12.206903 containerd[1682]: time="2026-01-24T00:45:12.206834877Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 24 00:45:12.210622 containerd[1682]: time="2026-01-24T00:45:12.210525951Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 488.514347ms" Jan 24 00:45:12.210812 containerd[1682]: time="2026-01-24T00:45:12.210787341Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 24 00:45:12.212211 containerd[1682]: time="2026-01-24T00:45:12.212179392Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jan 24 00:45:12.386132 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 24 00:45:12.386265 kernel: audit: type=1130 audit(1769215512.382:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:12.382000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:12.383562 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 24 00:45:12.402793 (kubelet)[2287]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 24 00:45:12.484257 kubelet[2287]: E0124 00:45:12.484042 2287 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 24 00:45:12.491426 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 24 00:45:12.491826 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 24 00:45:12.491000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 24 00:45:12.492714 systemd[1]: kubelet.service: Consumed 249ms CPU time, 110.2M memory peak. Jan 24 00:45:12.504145 kernel: audit: type=1131 audit(1769215512.491:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 24 00:45:12.767641 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1690253054.mount: Deactivated successfully. Jan 24 00:45:14.762402 containerd[1682]: time="2026-01-24T00:45:14.762350086Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:14.763614 containerd[1682]: time="2026-01-24T00:45:14.763382217Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=56977083" Jan 24 00:45:14.764500 containerd[1682]: time="2026-01-24T00:45:14.764478408Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:14.766868 containerd[1682]: time="2026-01-24T00:45:14.766842040Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:14.767638 containerd[1682]: time="2026-01-24T00:45:14.767607581Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.555281419s" Jan 24 00:45:14.767681 containerd[1682]: time="2026-01-24T00:45:14.767641661Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Jan 24 00:45:17.279000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:17.280855 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 24 00:45:17.281026 systemd[1]: kubelet.service: Consumed 249ms CPU time, 110.2M memory peak. Jan 24 00:45:17.288439 kernel: audit: type=1130 audit(1769215517.279:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:17.288502 kernel: audit: type=1131 audit(1769215517.279:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:17.279000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:17.286291 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 24 00:45:17.318574 systemd[1]: Reload requested from client PID 2379 ('systemctl') (unit session-8.scope)... Jan 24 00:45:17.318602 systemd[1]: Reloading... 
Jan 24 00:45:17.465099 zram_generator::config[2426]: No configuration found. Jan 24 00:45:17.631819 systemd[1]: Reloading finished in 312 ms. Jan 24 00:45:17.654127 kernel: audit: type=1334 audit(1769215517.650:297): prog-id=63 op=LOAD Jan 24 00:45:17.650000 audit: BPF prog-id=63 op=LOAD Jan 24 00:45:17.652000 audit: BPF prog-id=58 op=UNLOAD Jan 24 00:45:17.663687 kernel: audit: type=1334 audit(1769215517.652:298): prog-id=58 op=UNLOAD Jan 24 00:45:17.663775 kernel: audit: type=1334 audit(1769215517.653:299): prog-id=64 op=LOAD Jan 24 00:45:17.653000 audit: BPF prog-id=64 op=LOAD Jan 24 00:45:17.672083 kernel: audit: type=1334 audit(1769215517.653:300): prog-id=46 op=UNLOAD Jan 24 00:45:17.653000 audit: BPF prog-id=46 op=UNLOAD Jan 24 00:45:17.655000 audit: BPF prog-id=65 op=LOAD Jan 24 00:45:17.655000 audit: BPF prog-id=66 op=LOAD Jan 24 00:45:17.675898 kernel: audit: type=1334 audit(1769215517.655:301): prog-id=65 op=LOAD Jan 24 00:45:17.675979 kernel: audit: type=1334 audit(1769215517.655:302): prog-id=66 op=LOAD Jan 24 00:45:17.679116 kernel: audit: type=1334 audit(1769215517.655:303): prog-id=53 op=UNLOAD Jan 24 00:45:17.655000 audit: BPF prog-id=53 op=UNLOAD Jan 24 00:45:17.655000 audit: BPF prog-id=54 op=UNLOAD Jan 24 00:45:17.687133 kernel: audit: type=1334 audit(1769215517.655:304): prog-id=54 op=UNLOAD Jan 24 00:45:17.693153 kernel: audit: type=1334 audit(1769215517.655:305): prog-id=67 op=LOAD Jan 24 00:45:17.655000 audit: BPF prog-id=67 op=LOAD Jan 24 00:45:17.698266 kernel: audit: type=1334 audit(1769215517.655:306): prog-id=43 op=UNLOAD Jan 24 00:45:17.655000 audit: BPF prog-id=43 op=UNLOAD Jan 24 00:45:17.694787 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 24 00:45:17.694863 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 24 00:45:17.695137 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 24 00:45:17.695175 systemd[1]: kubelet.service: Consumed 133ms CPU time, 98.5M memory peak. Jan 24 00:45:17.697229 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 24 00:45:17.656000 audit: BPF prog-id=68 op=LOAD Jan 24 00:45:17.656000 audit: BPF prog-id=69 op=LOAD Jan 24 00:45:17.656000 audit: BPF prog-id=44 op=UNLOAD Jan 24 00:45:17.656000 audit: BPF prog-id=45 op=UNLOAD Jan 24 00:45:17.660000 audit: BPF prog-id=70 op=LOAD Jan 24 00:45:17.660000 audit: BPF prog-id=50 op=UNLOAD Jan 24 00:45:17.660000 audit: BPF prog-id=71 op=LOAD Jan 24 00:45:17.660000 audit: BPF prog-id=72 op=LOAD Jan 24 00:45:17.660000 audit: BPF prog-id=51 op=UNLOAD Jan 24 00:45:17.660000 audit: BPF prog-id=52 op=UNLOAD Jan 24 00:45:17.661000 audit: BPF prog-id=73 op=LOAD Jan 24 00:45:17.661000 audit: BPF prog-id=59 op=UNLOAD Jan 24 00:45:17.665000 audit: BPF prog-id=74 op=LOAD Jan 24 00:45:17.668000 audit: BPF prog-id=60 op=UNLOAD Jan 24 00:45:17.668000 audit: BPF prog-id=75 op=LOAD Jan 24 00:45:17.668000 audit: BPF prog-id=76 op=LOAD Jan 24 00:45:17.668000 audit: BPF prog-id=61 op=UNLOAD Jan 24 00:45:17.668000 audit: BPF prog-id=62 op=UNLOAD Jan 24 00:45:17.669000 audit: BPF prog-id=77 op=LOAD Jan 24 00:45:17.669000 audit: BPF prog-id=47 op=UNLOAD Jan 24 00:45:17.669000 audit: BPF prog-id=78 op=LOAD Jan 24 00:45:17.669000 audit: BPF prog-id=79 op=LOAD Jan 24 00:45:17.669000 audit: BPF prog-id=48 op=UNLOAD Jan 24 00:45:17.669000 audit: BPF prog-id=49 op=UNLOAD Jan 24 00:45:17.670000 audit: BPF prog-id=80 op=LOAD Jan 24 00:45:17.670000 audit: BPF prog-id=55 op=UNLOAD Jan 24 00:45:17.670000 audit: BPF prog-id=81 op=LOAD Jan 24 00:45:17.670000 audit: BPF prog-id=82 op=LOAD Jan 24 00:45:17.670000 audit: BPF prog-id=56 op=UNLOAD Jan 24 00:45:17.670000 audit: BPF prog-id=57 op=UNLOAD Jan 24 00:45:17.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 24 00:45:17.846748 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 24 00:45:17.849000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:17.853455 (kubelet)[2479]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 24 00:45:17.920272 kubelet[2479]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 24 00:45:17.920272 kubelet[2479]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 24 00:45:17.920272 kubelet[2479]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
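The three deprecation warnings above all point the same way: those settings belong in the KubeletConfiguration file passed via --config rather than on the command line. A hedged sketch of what that could look like, assuming the v1beta1 field names containerRuntimeEndpoint and volumePluginDir and a default containerd socket (verify both against the kubelet release in use; --pod-infra-container-image has no config-file equivalent, and per the log the sandbox image is taken from the CRI runtime instead):

    import yaml   # PyYAML

    # Assumed KubeletConfiguration equivalents of the deprecated flags;
    # field names and the socket path should be checked for this kubelet version.
    kubelet_config = {
        "apiVersion": "kubelet.config.k8s.io/v1beta1",
        "kind": "KubeletConfiguration",
        "containerRuntimeEndpoint": "unix:///run/containerd/containerd.sock",
        "volumePluginDir": "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/",
    }
    print(yaml.safe_dump(kubelet_config, sort_keys=False))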
Jan 24 00:45:17.920942 kubelet[2479]: I0124 00:45:17.920348 2479 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 24 00:45:18.239777 kubelet[2479]: I0124 00:45:18.239639 2479 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 24 00:45:18.239777 kubelet[2479]: I0124 00:45:18.239661 2479 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 24 00:45:18.239951 kubelet[2479]: I0124 00:45:18.239824 2479 server.go:956] "Client rotation is on, will bootstrap in background" Jan 24 00:45:18.263108 kubelet[2479]: I0124 00:45:18.261550 2479 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 24 00:45:18.263992 kubelet[2479]: E0124 00:45:18.263549 2479 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://65.109.167.77:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 65.109.167.77:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 24 00:45:18.268609 kubelet[2479]: I0124 00:45:18.268581 2479 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 24 00:45:18.279713 kubelet[2479]: I0124 00:45:18.279088 2479 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 24 00:45:18.279713 kubelet[2479]: I0124 00:45:18.279314 2479 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 24 00:45:18.279713 kubelet[2479]: I0124 00:45:18.279331 2479 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-9-1308b066bf","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 24 00:45:18.279713 kubelet[2479]: I0124 00:45:18.279564 2479 topology_manager.go:138] "Creating topology manager with none policy" Jan 24 00:45:18.279881 kubelet[2479]: I0124 
00:45:18.279571 2479 container_manager_linux.go:303] "Creating device plugin manager" Jan 24 00:45:18.280507 kubelet[2479]: I0124 00:45:18.280496 2479 state_mem.go:36] "Initialized new in-memory state store" Jan 24 00:45:18.283209 kubelet[2479]: I0124 00:45:18.283195 2479 kubelet.go:480] "Attempting to sync node with API server" Jan 24 00:45:18.283260 kubelet[2479]: I0124 00:45:18.283254 2479 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 24 00:45:18.283301 kubelet[2479]: I0124 00:45:18.283295 2479 kubelet.go:386] "Adding apiserver pod source" Jan 24 00:45:18.283343 kubelet[2479]: I0124 00:45:18.283337 2479 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 24 00:45:18.293986 kubelet[2479]: E0124 00:45:18.293955 2479 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://65.109.167.77:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4593-0-0-9-1308b066bf&limit=500&resourceVersion=0\": dial tcp 65.109.167.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 24 00:45:18.294374 kubelet[2479]: I0124 00:45:18.294364 2479 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 24 00:45:18.294836 kubelet[2479]: I0124 00:45:18.294823 2479 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 24 00:45:18.296108 kubelet[2479]: W0124 00:45:18.295947 2479 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 24 00:45:18.296902 kubelet[2479]: E0124 00:45:18.296868 2479 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://65.109.167.77:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 65.109.167.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 24 00:45:18.299248 kubelet[2479]: I0124 00:45:18.299236 2479 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 24 00:45:18.299334 kubelet[2479]: I0124 00:45:18.299326 2479 server.go:1289] "Started kubelet" Jan 24 00:45:18.301572 kubelet[2479]: I0124 00:45:18.301560 2479 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 24 00:45:18.307598 kubelet[2479]: E0124 00:45:18.304645 2479 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://65.109.167.77:6443/api/v1/namespaces/default/events\": dial tcp 65.109.167.77:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4593-0-0-9-1308b066bf.188d8423e35693ab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4593-0-0-9-1308b066bf,UID:ci-4593-0-0-9-1308b066bf,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4593-0-0-9-1308b066bf,},FirstTimestamp:2026-01-24 00:45:18.299296683 +0000 UTC m=+0.437743196,LastTimestamp:2026-01-24 00:45:18.299296683 +0000 UTC m=+0.437743196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-9-1308b066bf,}" Jan 24 00:45:18.307986 kubelet[2479]: I0124 00:45:18.307946 2479 server.go:180] "Starting to listen" 
address="0.0.0.0" port=10250 Jan 24 00:45:18.311370 kubelet[2479]: I0124 00:45:18.311357 2479 server.go:317] "Adding debug handlers to kubelet server" Jan 24 00:45:18.314118 kubelet[2479]: I0124 00:45:18.314050 2479 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 24 00:45:18.314320 kubelet[2479]: I0124 00:45:18.314310 2479 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 24 00:45:18.314527 kubelet[2479]: I0124 00:45:18.314517 2479 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 24 00:45:18.317032 kubelet[2479]: E0124 00:45:18.317019 2479 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:18.317538 kubelet[2479]: I0124 00:45:18.317528 2479 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 24 00:45:18.317804 kubelet[2479]: I0124 00:45:18.317795 2479 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 24 00:45:18.317873 kubelet[2479]: I0124 00:45:18.317867 2479 reconciler.go:26] "Reconciler: start to sync state" Jan 24 00:45:18.318343 kubelet[2479]: E0124 00:45:18.318330 2479 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://65.109.167.77:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 65.109.167.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 24 00:45:18.318556 kubelet[2479]: I0124 00:45:18.318543 2479 factory.go:223] Registration of the systemd container factory successfully Jan 24 00:45:18.318719 kubelet[2479]: I0124 00:45:18.318641 2479 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 24 00:45:18.319569 kubelet[2479]: E0124 00:45:18.319551 2479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.109.167.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-9-1308b066bf?timeout=10s\": dial tcp 65.109.167.77:6443: connect: connection refused" interval="200ms" Jan 24 00:45:18.319718 kubelet[2479]: I0124 00:45:18.319708 2479 factory.go:223] Registration of the containerd container factory successfully Jan 24 00:45:18.320000 audit[2495]: NETFILTER_CFG table=mangle:42 family=10 entries=2 op=nft_register_chain pid=2495 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:18.322250 kubelet[2479]: E0124 00:45:18.322221 2479 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 24 00:45:18.320000 audit[2495]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffdca2246d0 a2=0 a3=0 items=0 ppid=2479 pid=2495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.320000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 24 00:45:18.328597 kubelet[2479]: I0124 00:45:18.328578 2479 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 24 00:45:18.330000 audit[2500]: NETFILTER_CFG table=mangle:43 family=10 entries=1 op=nft_register_chain pid=2500 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:18.330000 audit[2500]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc26589ba0 a2=0 a3=0 items=0 ppid=2479 pid=2500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.330000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 24 00:45:18.333025 kubelet[2479]: I0124 00:45:18.332998 2479 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 24 00:45:18.333025 kubelet[2479]: I0124 00:45:18.333011 2479 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 24 00:45:18.333025 kubelet[2479]: I0124 00:45:18.333032 2479 state_mem.go:36] "Initialized new in-memory state store" Jan 24 00:45:18.333000 audit[2498]: NETFILTER_CFG table=mangle:44 family=2 entries=2 op=nft_register_chain pid=2498 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:18.333000 audit[2498]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffdcdc5d080 a2=0 a3=0 items=0 ppid=2479 pid=2498 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.334885 kubelet[2479]: I0124 00:45:18.334843 2479 policy_none.go:49] "None policy: Start" Jan 24 00:45:18.334885 kubelet[2479]: I0124 00:45:18.334855 2479 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 24 00:45:18.334885 kubelet[2479]: I0124 00:45:18.334864 2479 state_mem.go:35] "Initializing new in-memory state store" Jan 24 00:45:18.333000 audit[2502]: NETFILTER_CFG table=nat:45 family=10 entries=1 op=nft_register_chain pid=2502 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:18.333000 audit[2502]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff659c1400 a2=0 a3=0 items=0 ppid=2479 pid=2502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.333000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 24 00:45:18.333000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 24 00:45:18.335000 audit[2504]: NETFILTER_CFG table=filter:46 family=10 
entries=1 op=nft_register_chain pid=2504 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:18.335000 audit[2504]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc7aa2b8e0 a2=0 a3=0 items=0 ppid=2479 pid=2504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.335000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 24 00:45:18.337000 audit[2503]: NETFILTER_CFG table=filter:47 family=2 entries=1 op=nft_register_chain pid=2503 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:18.337000 audit[2503]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffedff907c0 a2=0 a3=0 items=0 ppid=2479 pid=2503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.337000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 24 00:45:18.341125 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 24 00:45:18.345000 audit[2506]: NETFILTER_CFG table=filter:48 family=2 entries=2 op=nft_register_chain pid=2506 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:18.345000 audit[2506]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe20734970 a2=0 a3=0 items=0 ppid=2479 pid=2506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.345000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 24 00:45:18.348949 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 24 00:45:18.356045 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
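
The three "Created slice" messages above are the kubelet's cgroups-per-QoS layout being set up: Guaranteed pods live directly under kubepods.slice, Burstable pods under kubepods-burstable.slice, and BestEffort pods under kubepods-besteffort.slice, with one child slice per pod. The sketch below is not kubelet code, only a minimal illustration of how a pod's QoS class and UID map to the slice names that appear later in this log (systemd escaping is simplified; the UIDs here contain no dashes, so the substitution is a no-op).

package main

import (
	"fmt"
	"strings"
)

// qosSlice returns the parent slice the kubelet creates per QoS class,
// matching the "Created slice kubepods*.slice" messages above.
func qosSlice(qos string) string {
	switch qos {
	case "Guaranteed":
		return "kubepods.slice"
	case "Burstable":
		return "kubepods-burstable.slice"
	default: // BestEffort
		return "kubepods-besteffort.slice"
	}
}

// podSlice builds the per-pod child slice name, e.g.
// kubepods-burstable-podfb10a41fc95ee03e4fcf793bdb671159.slice.
func podSlice(qos, uid string) string {
	parent := strings.TrimSuffix(qosSlice(qos), ".slice")
	return fmt.Sprintf("%s-pod%s.slice", parent, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	fmt.Println(podSlice("Burstable", "fb10a41fc95ee03e4fcf793bdb671159"))
}
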
Jan 24 00:45:18.354000 audit[2508]: NETFILTER_CFG table=filter:49 family=2 entries=2 op=nft_register_chain pid=2508 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:18.354000 audit[2508]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe58d43380 a2=0 a3=0 items=0 ppid=2479 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.354000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 24 00:45:18.362970 kubelet[2479]: E0124 00:45:18.362918 2479 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 24 00:45:18.364552 kubelet[2479]: I0124 00:45:18.364521 2479 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 24 00:45:18.364552 kubelet[2479]: I0124 00:45:18.364536 2479 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 24 00:45:18.366598 kubelet[2479]: I0124 00:45:18.366578 2479 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 24 00:45:18.367282 kubelet[2479]: E0124 00:45:18.367256 2479 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 24 00:45:18.367282 kubelet[2479]: E0124 00:45:18.367285 2479 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:18.374000 audit[2511]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_rule pid=2511 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:18.374000 audit[2511]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7fff08b10850 a2=0 a3=0 items=0 ppid=2479 pid=2511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.374000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 24 00:45:18.376422 kubelet[2479]: I0124 00:45:18.376323 2479 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 24 00:45:18.376422 kubelet[2479]: I0124 00:45:18.376415 2479 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 24 00:45:18.376571 kubelet[2479]: I0124 00:45:18.376454 2479 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
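
The audit PROCTITLE fields in the records above are the full command line of the audited process, hex-encoded with NUL bytes between arguments. A small stdlib-only Go sketch for turning them back into readable commands; the sample value is copied from the record above and decodes to the iptables call that appends the KUBE-FIREWALL rule blocking incoming localnet connections to 127.0.0.0/8 (argument boundaries are flattened to spaces for readability).

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle converts an audit PROCTITLE value (hex-encoded argv with
// NUL separators) into a single readable command line.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	return strings.ReplaceAll(string(raw), "\x00", " "), nil
}

func main() {
	// Copied from the audit record above.
	const sample = "69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38"
	cmd, err := decodeProctitle(sample)
	if err != nil {
		panic(err)
	}
	fmt.Println(cmd)
	// Prints: iptables -w 5 -W 100000 -A KUBE-FIREWALL -t filter -m comment
	//         --comment block incoming localnet connections --dst 127.0.0.0/8
}
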
Jan 24 00:45:18.376571 kubelet[2479]: I0124 00:45:18.376460 2479 kubelet.go:2436] "Starting kubelet main sync loop" Jan 24 00:45:18.376571 kubelet[2479]: E0124 00:45:18.376491 2479 kubelet.go:2460] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Jan 24 00:45:18.377250 kubelet[2479]: E0124 00:45:18.377216 2479 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://65.109.167.77:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 65.109.167.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 24 00:45:18.377000 audit[2512]: NETFILTER_CFG table=mangle:51 family=2 entries=1 op=nft_register_chain pid=2512 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:18.377000 audit[2512]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffabef59e0 a2=0 a3=0 items=0 ppid=2479 pid=2512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.377000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 24 00:45:18.378000 audit[2514]: NETFILTER_CFG table=nat:52 family=2 entries=1 op=nft_register_chain pid=2514 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:18.378000 audit[2514]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdfa8ed9b0 a2=0 a3=0 items=0 ppid=2479 pid=2514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.378000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 24 00:45:18.379000 audit[2515]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2515 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:18.379000 audit[2515]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd83460010 a2=0 a3=0 items=0 ppid=2479 pid=2515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.379000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 24 00:45:18.468303 kubelet[2479]: I0124 00:45:18.468217 2479 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.468865 kubelet[2479]: E0124 00:45:18.468815 2479 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.109.167.77:6443/api/v1/nodes\": dial tcp 65.109.167.77:6443: connect: connection refused" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.498523 systemd[1]: Created slice kubepods-burstable-podfb10a41fc95ee03e4fcf793bdb671159.slice - libcontainer container kubepods-burstable-podfb10a41fc95ee03e4fcf793bdb671159.slice. 
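
Every "dial tcp 65.109.167.77:6443: connect: connection refused" above is expected at this point in the boot: the kube-apiserver this kubelet talks to is itself one of the static pods being started here, so the reflectors, the lease controller, and node registration simply keep retrying until that container is serving. A stdlib-only sketch of the kind of wait loop you might run while watching such a bootstrap; the address is taken from the errors above, and the interval and timeout are arbitrary illustration values.

package main

import (
	"fmt"
	"net"
	"time"
)

// waitForAPIServer polls a TCP endpoint until it accepts connections, i.e.
// the moment the "connection refused" errors in the log above stop.
func waitForAPIServer(addr string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			return nil
		}
		fmt.Printf("still waiting for %s: %v\n", addr, err)
		time.Sleep(interval)
	}
	return fmt.Errorf("%s not reachable within %s", addr, timeout)
}

func main() {
	if err := waitForAPIServer("65.109.167.77:6443", 2*time.Second, 2*time.Minute); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("kube-apiserver is accepting connections")
}
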
Jan 24 00:45:18.516528 kubelet[2479]: E0124 00:45:18.516493 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.519482 kubelet[2479]: I0124 00:45:18.519344 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb10a41fc95ee03e4fcf793bdb671159-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-9-1308b066bf\" (UID: \"fb10a41fc95ee03e4fcf793bdb671159\") " pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.519482 kubelet[2479]: I0124 00:45:18.519425 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb10a41fc95ee03e4fcf793bdb671159-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-9-1308b066bf\" (UID: \"fb10a41fc95ee03e4fcf793bdb671159\") " pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.519719 systemd[1]: Created slice kubepods-burstable-pod3f86d978a1c21216234796498708a4cc.slice - libcontainer container kubepods-burstable-pod3f86d978a1c21216234796498708a4cc.slice. Jan 24 00:45:18.521241 kubelet[2479]: I0124 00:45:18.519736 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.521241 kubelet[2479]: I0124 00:45:18.519765 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.521241 kubelet[2479]: I0124 00:45:18.520744 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb10a41fc95ee03e4fcf793bdb671159-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-9-1308b066bf\" (UID: \"fb10a41fc95ee03e4fcf793bdb671159\") " pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.521590 kubelet[2479]: E0124 00:45:18.521550 2479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.109.167.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-9-1308b066bf?timeout=10s\": dial tcp 65.109.167.77:6443: connect: connection refused" interval="400ms" Jan 24 00:45:18.526251 kubelet[2479]: E0124 00:45:18.526222 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.530504 systemd[1]: Created slice kubepods-burstable-pod032bbb787b1aa2baa92b96ada46e1f9a.slice - libcontainer container kubepods-burstable-pod032bbb787b1aa2baa92b96ada46e1f9a.slice. 
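
The "Failed to ensure lease exists, will retry" messages (with the retry interval increasing each attempt) refer to the kubelet's node heartbeat: it maintains a Lease object named after the node in the kube-node-lease namespace, and that controller cannot create the Lease while the apiserver is down. Once the cluster is up, the same object can be inspected with client-go; a minimal sketch, assuming a kubeadm-style kubeconfig path (adjust to whatever credentials exist on this host).

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; any kubeconfig with read access works.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubelet.conf")
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// This is the object behind the lease URL in the error above.
	lease, err := client.CoordinationV1().Leases("kube-node-lease").Get(
		context.TODO(), "ci-4593-0-0-9-1308b066bf", metav1.GetOptions{})
	if err != nil {
		fmt.Println("lease not available yet:", err)
		return
	}
	fmt.Println("node lease last renewed:", lease.Spec.RenewTime)
}
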
Jan 24 00:45:18.534504 kubelet[2479]: E0124 00:45:18.534454 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.621532 kubelet[2479]: I0124 00:45:18.620984 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.621532 kubelet[2479]: I0124 00:45:18.621144 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.621532 kubelet[2479]: I0124 00:45:18.621204 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/032bbb787b1aa2baa92b96ada46e1f9a-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-9-1308b066bf\" (UID: \"032bbb787b1aa2baa92b96ada46e1f9a\") " pod="kube-system/kube-scheduler-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.621532 kubelet[2479]: I0124 00:45:18.621276 2479 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.672320 kubelet[2479]: I0124 00:45:18.672239 2479 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.672962 kubelet[2479]: E0124 00:45:18.672866 2479 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.109.167.77:6443/api/v1/nodes\": dial tcp 65.109.167.77:6443: connect: connection refused" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:18.819184 containerd[1682]: time="2026-01-24T00:45:18.818856206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-9-1308b066bf,Uid:fb10a41fc95ee03e4fcf793bdb671159,Namespace:kube-system,Attempt:0,}" Jan 24 00:45:18.828651 containerd[1682]: time="2026-01-24T00:45:18.828401544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-9-1308b066bf,Uid:3f86d978a1c21216234796498708a4cc,Namespace:kube-system,Attempt:0,}" Jan 24 00:45:18.836725 containerd[1682]: time="2026-01-24T00:45:18.836662391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-9-1308b066bf,Uid:032bbb787b1aa2baa92b96ada46e1f9a,Namespace:kube-system,Attempt:0,}" Jan 24 00:45:18.879505 containerd[1682]: time="2026-01-24T00:45:18.879396856Z" level=info msg="connecting to shim 4dfe6ad20a6e7eb089b21391a61b294b137da1e41cd41bf1de11588cd19c0201" address="unix:///run/containerd/s/9979f7af94fed64028fbe8c1a14a350b4fea4245925ae1cc16211b08ed3435c9" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:45:18.890107 containerd[1682]: 
time="2026-01-24T00:45:18.889347524Z" level=info msg="connecting to shim ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059" address="unix:///run/containerd/s/ebf551b31e5c9b711e8b691f6e6f8b0fc219e0a6d3842e59f3ef4958802d1dfe" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:45:18.915874 containerd[1682]: time="2026-01-24T00:45:18.915801866Z" level=info msg="connecting to shim 1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15" address="unix:///run/containerd/s/0249266bdae7f393ec8080b59834f0b7f8943ea10ac9194187eac7ebc74dfdb6" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:45:18.927497 kubelet[2479]: E0124 00:45:18.927400 2479 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.109.167.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-9-1308b066bf?timeout=10s\": dial tcp 65.109.167.77:6443: connect: connection refused" interval="800ms" Jan 24 00:45:18.962456 systemd[1]: Started cri-containerd-ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059.scope - libcontainer container ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059. Jan 24 00:45:18.970047 systemd[1]: Started cri-containerd-4dfe6ad20a6e7eb089b21391a61b294b137da1e41cd41bf1de11588cd19c0201.scope - libcontainer container 4dfe6ad20a6e7eb089b21391a61b294b137da1e41cd41bf1de11588cd19c0201. Jan 24 00:45:18.981606 systemd[1]: Started cri-containerd-1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15.scope - libcontainer container 1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15. Jan 24 00:45:18.985000 audit: BPF prog-id=83 op=LOAD Jan 24 00:45:18.986000 audit: BPF prog-id=84 op=LOAD Jan 24 00:45:18.986000 audit[2564]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00018c238 a2=98 a3=0 items=0 ppid=2533 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565313337643865306564306239373465316632363866306363393438 Jan 24 00:45:18.987000 audit: BPF prog-id=84 op=UNLOAD Jan 24 00:45:18.987000 audit[2564]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565313337643865306564306239373465316632363866306363393438 Jan 24 00:45:18.987000 audit: BPF prog-id=85 op=LOAD Jan 24 00:45:18.987000 audit[2564]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00018c488 a2=98 a3=0 items=0 ppid=2533 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.987000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565313337643865306564306239373465316632363866306363393438 Jan 24 00:45:18.987000 audit: BPF prog-id=86 op=LOAD Jan 24 00:45:18.987000 audit[2564]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00018c218 a2=98 a3=0 items=0 ppid=2533 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565313337643865306564306239373465316632363866306363393438 Jan 24 00:45:18.987000 audit: BPF prog-id=86 op=UNLOAD Jan 24 00:45:18.987000 audit[2564]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565313337643865306564306239373465316632363866306363393438 Jan 24 00:45:18.987000 audit: BPF prog-id=85 op=UNLOAD Jan 24 00:45:18.987000 audit[2564]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565313337643865306564306239373465316632363866306363393438 Jan 24 00:45:18.987000 audit: BPF prog-id=87 op=LOAD Jan 24 00:45:18.987000 audit[2564]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00018c6e8 a2=98 a3=0 items=0 ppid=2533 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:18.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565313337643865306564306239373465316632363866306363393438 Jan 24 00:45:19.000000 audit: BPF prog-id=88 op=LOAD Jan 24 00:45:19.003000 audit: BPF prog-id=89 op=LOAD Jan 24 00:45:19.003000 audit[2600]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2562 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.003000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165343835336133363365323765393461323334373430313135306236 Jan 24 00:45:19.003000 audit: BPF prog-id=89 op=UNLOAD Jan 24 00:45:19.003000 audit[2600]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.003000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165343835336133363365323765393461323334373430313135306236 Jan 24 00:45:19.003000 audit: BPF prog-id=90 op=LOAD Jan 24 00:45:19.003000 audit[2600]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2562 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.003000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165343835336133363365323765393461323334373430313135306236 Jan 24 00:45:19.003000 audit: BPF prog-id=91 op=LOAD Jan 24 00:45:19.003000 audit[2600]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2562 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.003000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165343835336133363365323765393461323334373430313135306236 Jan 24 00:45:19.003000 audit: BPF prog-id=91 op=UNLOAD Jan 24 00:45:19.003000 audit[2600]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.003000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165343835336133363365323765393461323334373430313135306236 Jan 24 00:45:19.003000 audit: BPF prog-id=90 op=UNLOAD Jan 24 00:45:19.003000 audit[2600]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.003000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165343835336133363365323765393461323334373430313135306236 Jan 24 00:45:19.003000 audit: BPF prog-id=92 op=LOAD Jan 24 00:45:19.003000 audit[2600]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2562 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.003000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165343835336133363365323765393461323334373430313135306236 Jan 24 00:45:19.004000 audit: BPF prog-id=93 op=LOAD Jan 24 00:45:19.005000 audit: BPF prog-id=94 op=LOAD Jan 24 00:45:19.005000 audit[2566]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2525 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.005000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464666536616432306136653765623038396232313339316136316232 Jan 24 00:45:19.005000 audit: BPF prog-id=94 op=UNLOAD Jan 24 00:45:19.005000 audit[2566]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2525 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.005000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464666536616432306136653765623038396232313339316136316232 Jan 24 00:45:19.006000 audit: BPF prog-id=95 op=LOAD Jan 24 00:45:19.006000 audit[2566]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2525 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464666536616432306136653765623038396232313339316136316232 Jan 24 00:45:19.006000 audit: BPF prog-id=96 op=LOAD Jan 24 00:45:19.006000 audit[2566]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2525 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.006000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464666536616432306136653765623038396232313339316136316232 Jan 24 00:45:19.007000 audit: BPF prog-id=96 op=UNLOAD Jan 24 00:45:19.007000 audit[2566]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2525 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464666536616432306136653765623038396232313339316136316232 Jan 24 00:45:19.007000 audit: BPF prog-id=95 op=UNLOAD Jan 24 00:45:19.007000 audit[2566]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2525 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464666536616432306136653765623038396232313339316136316232 Jan 24 00:45:19.007000 audit: BPF prog-id=97 op=LOAD Jan 24 00:45:19.007000 audit[2566]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2525 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464666536616432306136653765623038396232313339316136316232 Jan 24 00:45:19.042701 containerd[1682]: time="2026-01-24T00:45:19.042663972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-9-1308b066bf,Uid:032bbb787b1aa2baa92b96ada46e1f9a,Namespace:kube-system,Attempt:0,} returns sandbox id \"ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059\"" Jan 24 00:45:19.053033 containerd[1682]: time="2026-01-24T00:45:19.052996701Z" level=info msg="CreateContainer within sandbox \"ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 24 00:45:19.066808 containerd[1682]: time="2026-01-24T00:45:19.066281642Z" level=info msg="Container 5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:45:19.074833 containerd[1682]: time="2026-01-24T00:45:19.073466128Z" level=info msg="CreateContainer within sandbox \"ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff\"" Jan 24 00:45:19.075590 containerd[1682]: time="2026-01-24T00:45:19.074246348Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-9-1308b066bf,Uid:3f86d978a1c21216234796498708a4cc,Namespace:kube-system,Attempt:0,} returns sandbox id \"1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15\"" Jan 24 00:45:19.075590 containerd[1682]: time="2026-01-24T00:45:19.074253238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-9-1308b066bf,Uid:fb10a41fc95ee03e4fcf793bdb671159,Namespace:kube-system,Attempt:0,} returns sandbox id \"4dfe6ad20a6e7eb089b21391a61b294b137da1e41cd41bf1de11588cd19c0201\"" Jan 24 00:45:19.075991 containerd[1682]: time="2026-01-24T00:45:19.075957900Z" level=info msg="StartContainer for \"5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff\"" Jan 24 00:45:19.076434 kubelet[2479]: I0124 00:45:19.076341 2479 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:19.076725 kubelet[2479]: E0124 00:45:19.076711 2479 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.109.167.77:6443/api/v1/nodes\": dial tcp 65.109.167.77:6443: connect: connection refused" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:19.077362 containerd[1682]: time="2026-01-24T00:45:19.077334491Z" level=info msg="connecting to shim 5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff" address="unix:///run/containerd/s/ebf551b31e5c9b711e8b691f6e6f8b0fc219e0a6d3842e59f3ef4958802d1dfe" protocol=ttrpc version=3 Jan 24 00:45:19.080614 containerd[1682]: time="2026-01-24T00:45:19.080488484Z" level=info msg="CreateContainer within sandbox \"4dfe6ad20a6e7eb089b21391a61b294b137da1e41cd41bf1de11588cd19c0201\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 24 00:45:19.087419 containerd[1682]: time="2026-01-24T00:45:19.087353989Z" level=info msg="Container 99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:45:19.095230 containerd[1682]: time="2026-01-24T00:45:19.095208896Z" level=info msg="CreateContainer within sandbox \"1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 24 00:45:19.099226 systemd[1]: Started cri-containerd-5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff.scope - libcontainer container 5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff. 
Jan 24 00:45:19.100456 containerd[1682]: time="2026-01-24T00:45:19.100429570Z" level=info msg="CreateContainer within sandbox \"4dfe6ad20a6e7eb089b21391a61b294b137da1e41cd41bf1de11588cd19c0201\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad\"" Jan 24 00:45:19.101186 containerd[1682]: time="2026-01-24T00:45:19.101156141Z" level=info msg="StartContainer for \"99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad\"" Jan 24 00:45:19.102726 containerd[1682]: time="2026-01-24T00:45:19.102653672Z" level=info msg="connecting to shim 99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad" address="unix:///run/containerd/s/9979f7af94fed64028fbe8c1a14a350b4fea4245925ae1cc16211b08ed3435c9" protocol=ttrpc version=3 Jan 24 00:45:19.109227 containerd[1682]: time="2026-01-24T00:45:19.109197858Z" level=info msg="Container bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:45:19.114557 containerd[1682]: time="2026-01-24T00:45:19.114524492Z" level=info msg="CreateContainer within sandbox \"1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad\"" Jan 24 00:45:19.118242 containerd[1682]: time="2026-01-24T00:45:19.118037485Z" level=info msg="StartContainer for \"bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad\"" Jan 24 00:45:19.119000 audit: BPF prog-id=98 op=LOAD Jan 24 00:45:19.120000 audit: BPF prog-id=99 op=LOAD Jan 24 00:45:19.120000 audit[2654]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2533 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343033383035663262333765366265656435306237336666346661 Jan 24 00:45:19.120000 audit: BPF prog-id=99 op=UNLOAD Jan 24 00:45:19.120000 audit[2654]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343033383035663262333765366265656435306237336666346661 Jan 24 00:45:19.120000 audit: BPF prog-id=100 op=LOAD Jan 24 00:45:19.120000 audit[2654]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2533 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.120000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343033383035663262333765366265656435306237336666346661 Jan 24 00:45:19.120000 audit: BPF prog-id=101 op=LOAD Jan 24 00:45:19.120000 audit[2654]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2533 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343033383035663262333765366265656435306237336666346661 Jan 24 00:45:19.120000 audit: BPF prog-id=101 op=UNLOAD Jan 24 00:45:19.120000 audit[2654]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343033383035663262333765366265656435306237336666346661 Jan 24 00:45:19.120000 audit: BPF prog-id=100 op=UNLOAD Jan 24 00:45:19.120000 audit[2654]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343033383035663262333765366265656435306237336666346661 Jan 24 00:45:19.120000 audit: BPF prog-id=102 op=LOAD Jan 24 00:45:19.120000 audit[2654]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2533 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343033383035663262333765366265656435306237336666346661 Jan 24 00:45:19.124568 containerd[1682]: time="2026-01-24T00:45:19.124389760Z" level=info msg="connecting to shim bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad" address="unix:///run/containerd/s/0249266bdae7f393ec8080b59834f0b7f8943ea10ac9194187eac7ebc74dfdb6" protocol=ttrpc version=3 Jan 24 00:45:19.131303 systemd[1]: Started cri-containerd-99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad.scope - libcontainer container 99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad. 
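
The dense runs of "audit: BPF prog-id=N op=LOAD" / "op=UNLOAD" records around each container start are produced by runc (comm="runc" in the SYSCALL records), most likely while it probes BPF support and attaches cgroup programs for the new scopes; several probe programs are unloaded immediately, while the ones left attached never get a matching UNLOAD. A small tally over stdin makes that pairing easy to check in a log like this one.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var bpfOp = regexp.MustCompile(`BPF prog-id=(\d+) op=(LOAD|UNLOAD)`)

func main() {
	outstanding := map[string]int{} // prog-id -> LOADs without a matching UNLOAD
	scanner := bufio.NewScanner(os.Stdin)
	scanner.Buffer(make([]byte, 1024*1024), 1024*1024)
	for scanner.Scan() {
		for _, m := range bpfOp.FindAllStringSubmatch(scanner.Text(), -1) {
			if m[2] == "LOAD" {
				outstanding[m[1]]++
			} else {
				outstanding[m[1]]--
			}
		}
	}
	for id, n := range outstanding {
		if n > 0 {
			fmt.Printf("prog-id %s still loaded (%d LOAD without UNLOAD)\n", id, n)
		}
	}
}
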
Jan 24 00:45:19.152220 systemd[1]: Started cri-containerd-bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad.scope - libcontainer container bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad. Jan 24 00:45:19.159000 audit: BPF prog-id=103 op=LOAD Jan 24 00:45:19.160000 audit: BPF prog-id=104 op=LOAD Jan 24 00:45:19.160000 audit[2677]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2525 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939663261393130363536626562313133373566633961386633313061 Jan 24 00:45:19.160000 audit: BPF prog-id=104 op=UNLOAD Jan 24 00:45:19.160000 audit[2677]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2525 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939663261393130363536626562313133373566633961386633313061 Jan 24 00:45:19.161000 audit: BPF prog-id=105 op=LOAD Jan 24 00:45:19.161000 audit[2677]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2525 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.161000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939663261393130363536626562313133373566633961386633313061 Jan 24 00:45:19.161000 audit: BPF prog-id=106 op=LOAD Jan 24 00:45:19.161000 audit[2677]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2525 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.161000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939663261393130363536626562313133373566633961386633313061 Jan 24 00:45:19.161000 audit: BPF prog-id=106 op=UNLOAD Jan 24 00:45:19.161000 audit[2677]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2525 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.161000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939663261393130363536626562313133373566633961386633313061 Jan 24 00:45:19.161000 audit: BPF prog-id=105 op=UNLOAD Jan 24 00:45:19.161000 audit[2677]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2525 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.161000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939663261393130363536626562313133373566633961386633313061 Jan 24 00:45:19.161000 audit: BPF prog-id=107 op=LOAD Jan 24 00:45:19.161000 audit[2677]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2525 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.161000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939663261393130363536626562313133373566633961386633313061 Jan 24 00:45:19.169801 containerd[1682]: time="2026-01-24T00:45:19.169705018Z" level=info msg="StartContainer for \"5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff\" returns successfully" Jan 24 00:45:19.186000 audit: BPF prog-id=108 op=LOAD Jan 24 00:45:19.187000 audit: BPF prog-id=109 op=LOAD Jan 24 00:45:19.187000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2562 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264373162396236323430376233313433393933376637313366306338 Jan 24 00:45:19.187000 audit: BPF prog-id=109 op=UNLOAD Jan 24 00:45:19.187000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264373162396236323430376233313433393933376637313366306338 Jan 24 00:45:19.187000 audit: BPF prog-id=110 op=LOAD Jan 24 00:45:19.187000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2562 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264373162396236323430376233313433393933376637313366306338 Jan 24 00:45:19.187000 audit: BPF prog-id=111 op=LOAD Jan 24 00:45:19.187000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2562 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264373162396236323430376233313433393933376637313366306338 Jan 24 00:45:19.187000 audit: BPF prog-id=111 op=UNLOAD Jan 24 00:45:19.187000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264373162396236323430376233313433393933376637313366306338 Jan 24 00:45:19.187000 audit: BPF prog-id=110 op=UNLOAD Jan 24 00:45:19.187000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264373162396236323430376233313433393933376637313366306338 Jan 24 00:45:19.187000 audit: BPF prog-id=112 op=LOAD Jan 24 00:45:19.187000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2562 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:19.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264373162396236323430376233313433393933376637313366306338 Jan 24 00:45:19.207687 kubelet[2479]: E0124 00:45:19.207654 2479 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://65.109.167.77:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4593-0-0-9-1308b066bf&limit=500&resourceVersion=0\": dial tcp 65.109.167.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 24 00:45:19.213432 containerd[1682]: 
time="2026-01-24T00:45:19.213175354Z" level=info msg="StartContainer for \"99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad\" returns successfully" Jan 24 00:45:19.239052 containerd[1682]: time="2026-01-24T00:45:19.239023196Z" level=info msg="StartContainer for \"bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad\" returns successfully" Jan 24 00:45:19.391150 kubelet[2479]: E0124 00:45:19.388598 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:19.391150 kubelet[2479]: E0124 00:45:19.390199 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:19.395492 kubelet[2479]: E0124 00:45:19.395454 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:19.879691 kubelet[2479]: I0124 00:45:19.879441 2479 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:20.395112 kubelet[2479]: E0124 00:45:20.394513 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:20.395504 kubelet[2479]: E0124 00:45:20.395283 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:20.713771 kubelet[2479]: E0124 00:45:20.713489 2479 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:20.803354 kubelet[2479]: I0124 00:45:20.803295 2479 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:20.803354 kubelet[2479]: E0124 00:45:20.803345 2479 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4593-0-0-9-1308b066bf\": node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:20.813914 kubelet[2479]: E0124 00:45:20.813848 2479 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:20.849112 kubelet[2479]: E0124 00:45:20.848993 2479 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4593-0-0-9-1308b066bf.188d8423e35693ab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4593-0-0-9-1308b066bf,UID:ci-4593-0-0-9-1308b066bf,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4593-0-0-9-1308b066bf,},FirstTimestamp:2026-01-24 00:45:18.299296683 +0000 UTC m=+0.437743196,LastTimestamp:2026-01-24 00:45:18.299296683 +0000 UTC m=+0.437743196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-9-1308b066bf,}" Jan 24 00:45:20.914490 kubelet[2479]: E0124 00:45:20.914173 2479 kubelet_node_status.go:466] "Error getting the current node from lister" 
err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:21.014835 kubelet[2479]: E0124 00:45:21.014669 2479 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:21.115920 kubelet[2479]: E0124 00:45:21.115850 2479 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:21.216880 kubelet[2479]: E0124 00:45:21.216826 2479 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:21.317142 kubelet[2479]: E0124 00:45:21.316960 2479 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:21.371797 kubelet[2479]: E0124 00:45:21.371292 2479 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-9-1308b066bf\" not found" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:21.417929 kubelet[2479]: E0124 00:45:21.417815 2479 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:21.518814 kubelet[2479]: E0124 00:45:21.518727 2479 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-9-1308b066bf\" not found" Jan 24 00:45:21.619695 kubelet[2479]: I0124 00:45:21.619474 2479 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:21.627394 kubelet[2479]: E0124 00:45:21.627352 2479 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:21.627394 kubelet[2479]: I0124 00:45:21.627380 2479 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:21.629610 kubelet[2479]: E0124 00:45:21.629548 2479 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-9-1308b066bf\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:21.629610 kubelet[2479]: I0124 00:45:21.629565 2479 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:21.630891 kubelet[2479]: E0124 00:45:21.630815 2479 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-9-1308b066bf\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:22.297356 kubelet[2479]: I0124 00:45:22.297200 2479 apiserver.go:52] "Watching apiserver" Jan 24 00:45:22.317969 kubelet[2479]: I0124 00:45:22.317922 2479 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 24 00:45:22.818219 systemd[1]: Reload requested from client PID 2765 ('systemctl') (unit session-8.scope)... Jan 24 00:45:22.818248 systemd[1]: Reloading... Jan 24 00:45:23.011082 zram_generator::config[2812]: No configuration found. Jan 24 00:45:23.224017 systemd[1]: Reloading finished in 404 ms. Jan 24 00:45:23.250386 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 24 00:45:23.263354 systemd[1]: kubelet.service: Deactivated successfully. Jan 24 00:45:23.263746 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 24 00:45:23.263857 systemd[1]: kubelet.service: Consumed 863ms CPU time, 129.2M memory peak. Jan 24 00:45:23.262000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:23.265756 kernel: kauditd_printk_skb: 200 callbacks suppressed Jan 24 00:45:23.265816 kernel: audit: type=1131 audit(1769215523.262:399): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:23.267504 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 24 00:45:23.279077 kernel: audit: type=1334 audit(1769215523.265:400): prog-id=113 op=LOAD Jan 24 00:45:23.265000 audit: BPF prog-id=113 op=LOAD Jan 24 00:45:23.265000 audit: BPF prog-id=64 op=UNLOAD Jan 24 00:45:23.286116 kernel: audit: type=1334 audit(1769215523.265:401): prog-id=64 op=UNLOAD Jan 24 00:45:23.267000 audit: BPF prog-id=114 op=LOAD Jan 24 00:45:23.289142 kernel: audit: type=1334 audit(1769215523.267:402): prog-id=114 op=LOAD Jan 24 00:45:23.267000 audit: BPF prog-id=73 op=UNLOAD Jan 24 00:45:23.297393 kernel: audit: type=1334 audit(1769215523.267:403): prog-id=73 op=UNLOAD Jan 24 00:45:23.297505 kernel: audit: type=1334 audit(1769215523.267:404): prog-id=115 op=LOAD Jan 24 00:45:23.267000 audit: BPF prog-id=115 op=LOAD Jan 24 00:45:23.267000 audit: BPF prog-id=63 op=UNLOAD Jan 24 00:45:23.302166 kernel: audit: type=1334 audit(1769215523.267:405): prog-id=63 op=UNLOAD Jan 24 00:45:23.273000 audit: BPF prog-id=116 op=LOAD Jan 24 00:45:23.273000 audit: BPF prog-id=70 op=UNLOAD Jan 24 00:45:23.306613 kernel: audit: type=1334 audit(1769215523.273:406): prog-id=116 op=LOAD Jan 24 00:45:23.306663 kernel: audit: type=1334 audit(1769215523.273:407): prog-id=70 op=UNLOAD Jan 24 00:45:23.273000 audit: BPF prog-id=117 op=LOAD Jan 24 00:45:23.310323 kernel: audit: type=1334 audit(1769215523.273:408): prog-id=117 op=LOAD Jan 24 00:45:23.273000 audit: BPF prog-id=118 op=LOAD Jan 24 00:45:23.273000 audit: BPF prog-id=71 op=UNLOAD Jan 24 00:45:23.273000 audit: BPF prog-id=72 op=UNLOAD Jan 24 00:45:23.273000 audit: BPF prog-id=119 op=LOAD Jan 24 00:45:23.273000 audit: BPF prog-id=80 op=UNLOAD Jan 24 00:45:23.273000 audit: BPF prog-id=120 op=LOAD Jan 24 00:45:23.273000 audit: BPF prog-id=121 op=LOAD Jan 24 00:45:23.273000 audit: BPF prog-id=81 op=UNLOAD Jan 24 00:45:23.273000 audit: BPF prog-id=82 op=UNLOAD Jan 24 00:45:23.277000 audit: BPF prog-id=122 op=LOAD Jan 24 00:45:23.277000 audit: BPF prog-id=74 op=UNLOAD Jan 24 00:45:23.277000 audit: BPF prog-id=123 op=LOAD Jan 24 00:45:23.277000 audit: BPF prog-id=124 op=LOAD Jan 24 00:45:23.277000 audit: BPF prog-id=75 op=UNLOAD Jan 24 00:45:23.277000 audit: BPF prog-id=76 op=UNLOAD Jan 24 00:45:23.277000 audit: BPF prog-id=125 op=LOAD Jan 24 00:45:23.284000 audit: BPF prog-id=126 op=LOAD Jan 24 00:45:23.284000 audit: BPF prog-id=65 op=UNLOAD Jan 24 00:45:23.284000 audit: BPF prog-id=66 op=UNLOAD Jan 24 00:45:23.285000 audit: BPF prog-id=127 op=LOAD Jan 24 00:45:23.285000 audit: BPF prog-id=67 op=UNLOAD Jan 24 00:45:23.285000 audit: BPF prog-id=128 op=LOAD Jan 24 00:45:23.285000 audit: BPF prog-id=129 op=LOAD Jan 24 
00:45:23.285000 audit: BPF prog-id=68 op=UNLOAD Jan 24 00:45:23.285000 audit: BPF prog-id=69 op=UNLOAD Jan 24 00:45:23.288000 audit: BPF prog-id=130 op=LOAD Jan 24 00:45:23.288000 audit: BPF prog-id=77 op=UNLOAD Jan 24 00:45:23.288000 audit: BPF prog-id=131 op=LOAD Jan 24 00:45:23.288000 audit: BPF prog-id=132 op=LOAD Jan 24 00:45:23.288000 audit: BPF prog-id=78 op=UNLOAD Jan 24 00:45:23.288000 audit: BPF prog-id=79 op=UNLOAD Jan 24 00:45:23.446001 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 24 00:45:23.446000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:23.458328 (kubelet)[2863]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 24 00:45:23.531648 kubelet[2863]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 24 00:45:23.531648 kubelet[2863]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 24 00:45:23.531648 kubelet[2863]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 24 00:45:23.531648 kubelet[2863]: I0124 00:45:23.530910 2863 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 24 00:45:23.544624 kubelet[2863]: I0124 00:45:23.544567 2863 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 24 00:45:23.544624 kubelet[2863]: I0124 00:45:23.544608 2863 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 24 00:45:23.545176 kubelet[2863]: I0124 00:45:23.545134 2863 server.go:956] "Client rotation is on, will bootstrap in background" Jan 24 00:45:23.548163 kubelet[2863]: I0124 00:45:23.548124 2863 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 24 00:45:23.552407 kubelet[2863]: I0124 00:45:23.552361 2863 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 24 00:45:23.558538 kubelet[2863]: I0124 00:45:23.558516 2863 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 24 00:45:23.566617 kubelet[2863]: I0124 00:45:23.566571 2863 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 24 00:45:23.566939 kubelet[2863]: I0124 00:45:23.566884 2863 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 24 00:45:23.567214 kubelet[2863]: I0124 00:45:23.566923 2863 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-9-1308b066bf","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 24 00:45:23.567214 kubelet[2863]: I0124 00:45:23.567197 2863 topology_manager.go:138] "Creating topology manager with none policy" Jan 24 00:45:23.567214 kubelet[2863]: I0124 00:45:23.567211 2863 container_manager_linux.go:303] "Creating device plugin manager" Jan 24 00:45:23.567395 kubelet[2863]: I0124 00:45:23.567265 2863 state_mem.go:36] "Initialized new in-memory state store" Jan 24 00:45:23.567951 kubelet[2863]: I0124 00:45:23.567507 2863 kubelet.go:480] "Attempting to sync node with API server" Jan 24 00:45:23.567951 kubelet[2863]: I0124 00:45:23.567527 2863 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 24 00:45:23.567951 kubelet[2863]: I0124 00:45:23.567548 2863 kubelet.go:386] "Adding apiserver pod source" Jan 24 00:45:23.567951 kubelet[2863]: I0124 00:45:23.567569 2863 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 24 00:45:23.572050 kubelet[2863]: I0124 00:45:23.571657 2863 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 24 00:45:23.573517 kubelet[2863]: I0124 00:45:23.572850 2863 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 24 00:45:23.583027 kubelet[2863]: I0124 00:45:23.582224 2863 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 24 00:45:23.583202 kubelet[2863]: I0124 00:45:23.583179 2863 server.go:1289] "Started kubelet" Jan 24 00:45:23.586282 kubelet[2863]: I0124 00:45:23.586261 2863 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 24 00:45:23.600777 kubelet[2863]: I0124 
00:45:23.600649 2863 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 24 00:45:23.602110 kubelet[2863]: I0124 00:45:23.601884 2863 server.go:317] "Adding debug handlers to kubelet server" Jan 24 00:45:23.610106 kubelet[2863]: I0124 00:45:23.609521 2863 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 24 00:45:23.610106 kubelet[2863]: I0124 00:45:23.609786 2863 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 24 00:45:23.610106 kubelet[2863]: I0124 00:45:23.610025 2863 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 24 00:45:23.612182 kubelet[2863]: I0124 00:45:23.612147 2863 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 24 00:45:23.614247 kubelet[2863]: I0124 00:45:23.614217 2863 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 24 00:45:23.614315 kubelet[2863]: I0124 00:45:23.614252 2863 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 24 00:45:23.614315 kubelet[2863]: I0124 00:45:23.614273 2863 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 24 00:45:23.614315 kubelet[2863]: I0124 00:45:23.614283 2863 kubelet.go:2436] "Starting kubelet main sync loop" Jan 24 00:45:23.614493 kubelet[2863]: E0124 00:45:23.614336 2863 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 24 00:45:23.614921 kubelet[2863]: I0124 00:45:23.614905 2863 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 24 00:45:23.616271 kubelet[2863]: I0124 00:45:23.616252 2863 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 24 00:45:23.616613 kubelet[2863]: I0124 00:45:23.616597 2863 reconciler.go:26] "Reconciler: start to sync state" Jan 24 00:45:23.624199 kubelet[2863]: I0124 00:45:23.624173 2863 factory.go:223] Registration of the systemd container factory successfully Jan 24 00:45:23.625126 kubelet[2863]: I0124 00:45:23.624243 2863 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 24 00:45:23.625126 kubelet[2863]: E0124 00:45:23.624398 2863 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 24 00:45:23.626124 kubelet[2863]: I0124 00:45:23.625750 2863 factory.go:223] Registration of the containerd container factory successfully Jan 24 00:45:23.661028 kubelet[2863]: I0124 00:45:23.660988 2863 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 24 00:45:23.661028 kubelet[2863]: I0124 00:45:23.661017 2863 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 24 00:45:23.661176 kubelet[2863]: I0124 00:45:23.661151 2863 state_mem.go:36] "Initialized new in-memory state store" Jan 24 00:45:23.661330 kubelet[2863]: I0124 00:45:23.661310 2863 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 24 00:45:23.661348 kubelet[2863]: I0124 00:45:23.661329 2863 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 24 00:45:23.661373 kubelet[2863]: I0124 00:45:23.661354 2863 policy_none.go:49] "None policy: Start" Jan 24 00:45:23.661397 kubelet[2863]: I0124 00:45:23.661372 2863 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 24 00:45:23.661397 kubelet[2863]: I0124 00:45:23.661392 2863 state_mem.go:35] "Initializing new in-memory state store" Jan 24 00:45:23.661550 kubelet[2863]: I0124 00:45:23.661531 2863 state_mem.go:75] "Updated machine memory state" Jan 24 00:45:23.668716 kubelet[2863]: E0124 00:45:23.668686 2863 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 24 00:45:23.668882 kubelet[2863]: I0124 00:45:23.668873 2863 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 24 00:45:23.668929 kubelet[2863]: I0124 00:45:23.668912 2863 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 24 00:45:23.669228 kubelet[2863]: I0124 00:45:23.669217 2863 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 24 00:45:23.674609 kubelet[2863]: E0124 00:45:23.674333 2863 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 24 00:45:23.716325 kubelet[2863]: I0124 00:45:23.716303 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.716688 kubelet[2863]: I0124 00:45:23.716677 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.716876 kubelet[2863]: I0124 00:45:23.716853 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.716964 kubelet[2863]: I0124 00:45:23.716896 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.717004 kubelet[2863]: I0124 00:45:23.716977 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.717031 kubelet[2863]: I0124 00:45:23.717003 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.717031 kubelet[2863]: I0124 00:45:23.717025 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.717208 kubelet[2863]: I0124 00:45:23.717050 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3f86d978a1c21216234796498708a4cc-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-9-1308b066bf\" (UID: \"3f86d978a1c21216234796498708a4cc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.717208 kubelet[2863]: I0124 00:45:23.717126 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/032bbb787b1aa2baa92b96ada46e1f9a-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-9-1308b066bf\" (UID: \"032bbb787b1aa2baa92b96ada46e1f9a\") " pod="kube-system/kube-scheduler-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.717208 kubelet[2863]: I0124 00:45:23.717140 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb10a41fc95ee03e4fcf793bdb671159-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-9-1308b066bf\" (UID: \"fb10a41fc95ee03e4fcf793bdb671159\") " 
pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.717208 kubelet[2863]: I0124 00:45:23.717153 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb10a41fc95ee03e4fcf793bdb671159-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-9-1308b066bf\" (UID: \"fb10a41fc95ee03e4fcf793bdb671159\") " pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.717208 kubelet[2863]: I0124 00:45:23.717172 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb10a41fc95ee03e4fcf793bdb671159-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-9-1308b066bf\" (UID: \"fb10a41fc95ee03e4fcf793bdb671159\") " pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.781188 kubelet[2863]: I0124 00:45:23.781054 2863 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.798271 kubelet[2863]: I0124 00:45:23.798131 2863 kubelet_node_status.go:124] "Node was previously registered" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:23.798684 kubelet[2863]: I0124 00:45:23.798573 2863 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-9-1308b066bf" Jan 24 00:45:24.570146 kubelet[2863]: I0124 00:45:24.570111 2863 apiserver.go:52] "Watching apiserver" Jan 24 00:45:24.617512 kubelet[2863]: I0124 00:45:24.617139 2863 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 24 00:45:24.649170 kubelet[2863]: I0124 00:45:24.648205 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:24.649170 kubelet[2863]: I0124 00:45:24.648387 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:24.663989 kubelet[2863]: E0124 00:45:24.663894 2863 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-9-1308b066bf\" already exists" pod="kube-system/kube-scheduler-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:24.664637 kubelet[2863]: E0124 00:45:24.664246 2863 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-9-1308b066bf\" already exists" pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" Jan 24 00:45:24.677617 kubelet[2863]: I0124 00:45:24.677371 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4593-0-0-9-1308b066bf" podStartSLOduration=1.676177666 podStartE2EDuration="1.676177666s" podCreationTimestamp="2026-01-24 00:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-24 00:45:24.674953678 +0000 UTC m=+1.206940535" watchObservedRunningTime="2026-01-24 00:45:24.676177666 +0000 UTC m=+1.208164523" Jan 24 00:45:24.688127 kubelet[2863]: I0124 00:45:24.688083 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4593-0-0-9-1308b066bf" podStartSLOduration=1.688073223 podStartE2EDuration="1.688073223s" podCreationTimestamp="2026-01-24 00:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-24 00:45:24.682162299 +0000 UTC m=+1.214149156" 
watchObservedRunningTime="2026-01-24 00:45:24.688073223 +0000 UTC m=+1.220060050" Jan 24 00:45:24.696241 kubelet[2863]: I0124 00:45:24.696211 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4593-0-0-9-1308b066bf" podStartSLOduration=1.696202276 podStartE2EDuration="1.696202276s" podCreationTimestamp="2026-01-24 00:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-24 00:45:24.688275931 +0000 UTC m=+1.220262758" watchObservedRunningTime="2026-01-24 00:45:24.696202276 +0000 UTC m=+1.228189093" Jan 24 00:45:27.672132 kubelet[2863]: I0124 00:45:27.672055 2863 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 24 00:45:27.672758 containerd[1682]: time="2026-01-24T00:45:27.672582927Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 24 00:45:27.673216 kubelet[2863]: I0124 00:45:27.672818 2863 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 24 00:45:28.553369 systemd[1]: Created slice kubepods-besteffort-pod50789240_7a6d_4e4a_a726_cbc7df2b4e10.slice - libcontainer container kubepods-besteffort-pod50789240_7a6d_4e4a_a726_cbc7df2b4e10.slice. Jan 24 00:45:28.651636 kubelet[2863]: I0124 00:45:28.651598 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/50789240-7a6d-4e4a-a726-cbc7df2b4e10-kube-proxy\") pod \"kube-proxy-mqg5k\" (UID: \"50789240-7a6d-4e4a-a726-cbc7df2b4e10\") " pod="kube-system/kube-proxy-mqg5k" Jan 24 00:45:28.652028 kubelet[2863]: I0124 00:45:28.651816 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/50789240-7a6d-4e4a-a726-cbc7df2b4e10-xtables-lock\") pod \"kube-proxy-mqg5k\" (UID: \"50789240-7a6d-4e4a-a726-cbc7df2b4e10\") " pod="kube-system/kube-proxy-mqg5k" Jan 24 00:45:28.652028 kubelet[2863]: I0124 00:45:28.651837 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50789240-7a6d-4e4a-a726-cbc7df2b4e10-lib-modules\") pod \"kube-proxy-mqg5k\" (UID: \"50789240-7a6d-4e4a-a726-cbc7df2b4e10\") " pod="kube-system/kube-proxy-mqg5k" Jan 24 00:45:28.652028 kubelet[2863]: I0124 00:45:28.651862 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xzjg\" (UniqueName: \"kubernetes.io/projected/50789240-7a6d-4e4a-a726-cbc7df2b4e10-kube-api-access-9xzjg\") pod \"kube-proxy-mqg5k\" (UID: \"50789240-7a6d-4e4a-a726-cbc7df2b4e10\") " pod="kube-system/kube-proxy-mqg5k" Jan 24 00:45:28.859778 containerd[1682]: time="2026-01-24T00:45:28.859577537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mqg5k,Uid:50789240-7a6d-4e4a-a726-cbc7df2b4e10,Namespace:kube-system,Attempt:0,}" Jan 24 00:45:28.870970 systemd[1]: Created slice kubepods-besteffort-pod1deb031e_6e9b_46ac_8d9a_64b60fe3b32b.slice - libcontainer container kubepods-besteffort-pod1deb031e_6e9b_46ac_8d9a_64b60fe3b32b.slice. 
Jan 24 00:45:28.887715 containerd[1682]: time="2026-01-24T00:45:28.886523164Z" level=info msg="connecting to shim 572656723705203c6175f044f27e219d95ffc53f3d005481a4bfcf9c92863782" address="unix:///run/containerd/s/d0b2dc604f2cfb480ee710db7f213d0a92a4462b93c7d70a01651101ee50b490" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:45:28.921186 systemd[1]: Started cri-containerd-572656723705203c6175f044f27e219d95ffc53f3d005481a4bfcf9c92863782.scope - libcontainer container 572656723705203c6175f044f27e219d95ffc53f3d005481a4bfcf9c92863782. Jan 24 00:45:28.938857 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 24 00:45:28.938938 kernel: audit: type=1334 audit(1769215528.935:441): prog-id=133 op=LOAD Jan 24 00:45:28.935000 audit: BPF prog-id=133 op=LOAD Jan 24 00:45:28.941000 audit: BPF prog-id=134 op=LOAD Jan 24 00:45:28.945101 kernel: audit: type=1334 audit(1769215528.941:442): prog-id=134 op=LOAD Jan 24 00:45:28.945483 kernel: audit: type=1300 audit(1769215528.941:442): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000206238 a2=98 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.941000 audit[2934]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000206238 a2=98 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.954957 kernel: audit: type=1327 audit(1769215528.941:442): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.955597 kubelet[2863]: I0124 00:45:28.955482 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1deb031e-6e9b-46ac-8d9a-64b60fe3b32b-var-lib-calico\") pod \"tigera-operator-7dcd859c48-w5567\" (UID: \"1deb031e-6e9b-46ac-8d9a-64b60fe3b32b\") " pod="tigera-operator/tigera-operator-7dcd859c48-w5567" Jan 24 00:45:28.955597 kubelet[2863]: I0124 00:45:28.955528 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt4mc\" (UniqueName: \"kubernetes.io/projected/1deb031e-6e9b-46ac-8d9a-64b60fe3b32b-kube-api-access-dt4mc\") pod \"tigera-operator-7dcd859c48-w5567\" (UID: \"1deb031e-6e9b-46ac-8d9a-64b60fe3b32b\") " pod="tigera-operator/tigera-operator-7dcd859c48-w5567" Jan 24 00:45:28.958450 kernel: audit: type=1334 audit(1769215528.941:443): prog-id=134 op=UNLOAD Jan 24 00:45:28.941000 audit: BPF prog-id=134 op=UNLOAD Jan 24 00:45:28.961970 kernel: audit: type=1300 audit(1769215528.941:443): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.941000 audit[2934]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.972075 kernel: audit: type=1327 audit(1769215528.941:443): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.972112 kernel: audit: type=1334 audit(1769215528.941:444): prog-id=135 op=LOAD Jan 24 00:45:28.941000 audit: BPF prog-id=135 op=LOAD Jan 24 00:45:28.941000 audit[2934]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000206488 a2=98 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.981123 kernel: audit: type=1300 audit(1769215528.941:444): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000206488 a2=98 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.981197 kernel: audit: type=1327 audit(1769215528.941:444): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.941000 audit: BPF prog-id=136 op=LOAD Jan 24 00:45:28.941000 audit[2934]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000206218 a2=98 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.941000 audit: BPF prog-id=136 op=UNLOAD Jan 24 00:45:28.941000 audit[2934]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.941000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.941000 audit: BPF prog-id=135 op=UNLOAD Jan 24 00:45:28.941000 audit[2934]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.941000 audit: BPF prog-id=137 op=LOAD Jan 24 00:45:28.941000 audit[2934]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0002066e8 a2=98 a3=0 items=0 ppid=2920 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:28.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537323635363732333730353230336336313735663034346632376532 Jan 24 00:45:28.988362 containerd[1682]: time="2026-01-24T00:45:28.988326143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mqg5k,Uid:50789240-7a6d-4e4a-a726-cbc7df2b4e10,Namespace:kube-system,Attempt:0,} returns sandbox id \"572656723705203c6175f044f27e219d95ffc53f3d005481a4bfcf9c92863782\"" Jan 24 00:45:28.994755 containerd[1682]: time="2026-01-24T00:45:28.994737266Z" level=info msg="CreateContainer within sandbox \"572656723705203c6175f044f27e219d95ffc53f3d005481a4bfcf9c92863782\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 24 00:45:29.010465 containerd[1682]: time="2026-01-24T00:45:29.008896339Z" level=info msg="Container 4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:45:29.023326 containerd[1682]: time="2026-01-24T00:45:29.023286883Z" level=info msg="CreateContainer within sandbox \"572656723705203c6175f044f27e219d95ffc53f3d005481a4bfcf9c92863782\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f\"" Jan 24 00:45:29.023930 containerd[1682]: time="2026-01-24T00:45:29.023895948Z" level=info msg="StartContainer for \"4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f\"" Jan 24 00:45:29.025261 containerd[1682]: time="2026-01-24T00:45:29.025184499Z" level=info msg="connecting to shim 4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f" address="unix:///run/containerd/s/d0b2dc604f2cfb480ee710db7f213d0a92a4462b93c7d70a01651101ee50b490" protocol=ttrpc version=3 Jan 24 00:45:29.051303 systemd[1]: Started cri-containerd-4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f.scope - libcontainer container 4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f. 
Jan 24 00:45:29.133000 audit: BPF prog-id=138 op=LOAD Jan 24 00:45:29.133000 audit[2959]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2920 pid=2959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464313735666533343837616665656630636631313162666365326239 Jan 24 00:45:29.133000 audit: BPF prog-id=139 op=LOAD Jan 24 00:45:29.133000 audit[2959]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2920 pid=2959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464313735666533343837616665656630636631313162666365326239 Jan 24 00:45:29.133000 audit: BPF prog-id=139 op=UNLOAD Jan 24 00:45:29.133000 audit[2959]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2920 pid=2959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464313735666533343837616665656630636631313162666365326239 Jan 24 00:45:29.133000 audit: BPF prog-id=138 op=UNLOAD Jan 24 00:45:29.133000 audit[2959]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2920 pid=2959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464313735666533343837616665656630636631313162666365326239 Jan 24 00:45:29.133000 audit: BPF prog-id=140 op=LOAD Jan 24 00:45:29.133000 audit[2959]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2920 pid=2959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464313735666533343837616665656630636631313162666365326239 Jan 24 00:45:29.169808 containerd[1682]: time="2026-01-24T00:45:29.169764634Z" level=info msg="StartContainer for 
\"4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f\" returns successfully" Jan 24 00:45:29.177762 containerd[1682]: time="2026-01-24T00:45:29.177714450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-w5567,Uid:1deb031e-6e9b-46ac-8d9a-64b60fe3b32b,Namespace:tigera-operator,Attempt:0,}" Jan 24 00:45:29.206301 containerd[1682]: time="2026-01-24T00:45:29.206248860Z" level=info msg="connecting to shim c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da" address="unix:///run/containerd/s/4e60b15fb44aa56a176acf912469d108694efd9035acae36a1b93ca7161871ed" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:45:29.227210 systemd[1]: Started cri-containerd-c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da.scope - libcontainer container c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da. Jan 24 00:45:29.240000 audit: BPF prog-id=141 op=LOAD Jan 24 00:45:29.241000 audit: BPF prog-id=142 op=LOAD Jan 24 00:45:29.241000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3000 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353930376461653862316464306233333565623430343731396235 Jan 24 00:45:29.241000 audit: BPF prog-id=142 op=UNLOAD Jan 24 00:45:29.241000 audit[3012]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353930376461653862316464306233333565623430343731396235 Jan 24 00:45:29.241000 audit: BPF prog-id=143 op=LOAD Jan 24 00:45:29.241000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3000 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353930376461653862316464306233333565623430343731396235 Jan 24 00:45:29.241000 audit: BPF prog-id=144 op=LOAD Jan 24 00:45:29.241000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3000 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.241000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353930376461653862316464306233333565623430343731396235 Jan 24 00:45:29.241000 audit: BPF prog-id=144 op=UNLOAD Jan 24 00:45:29.241000 audit[3012]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353930376461653862316464306233333565623430343731396235 Jan 24 00:45:29.241000 audit: BPF prog-id=143 op=UNLOAD Jan 24 00:45:29.241000 audit[3012]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353930376461653862316464306233333565623430343731396235 Jan 24 00:45:29.241000 audit: BPF prog-id=145 op=LOAD Jan 24 00:45:29.241000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3000 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353930376461653862316464306233333565623430343731396235 Jan 24 00:45:29.280550 containerd[1682]: time="2026-01-24T00:45:29.280488834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-w5567,Uid:1deb031e-6e9b-46ac-8d9a-64b60fe3b32b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da\"" Jan 24 00:45:29.283217 containerd[1682]: time="2026-01-24T00:45:29.282462461Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 24 00:45:29.351000 audit[3069]: NETFILTER_CFG table=mangle:54 family=10 entries=1 op=nft_register_chain pid=3069 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.351000 audit[3069]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffca033240 a2=0 a3=7fffca03322c items=0 ppid=2972 pid=3069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.351000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 24 00:45:29.357000 audit[3072]: NETFILTER_CFG table=nat:55 family=10 entries=1 
op=nft_register_chain pid=3072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.357000 audit[3072]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcd62e8e40 a2=0 a3=7ffcd62e8e2c items=0 ppid=2972 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.357000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 24 00:45:29.360000 audit[3074]: NETFILTER_CFG table=filter:56 family=10 entries=1 op=nft_register_chain pid=3074 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.360000 audit[3074]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd73edbfd0 a2=0 a3=7ffd73edbfbc items=0 ppid=2972 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.360000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 24 00:45:29.362000 audit[3075]: NETFILTER_CFG table=mangle:57 family=2 entries=1 op=nft_register_chain pid=3075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.362000 audit[3075]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc68fba9a0 a2=0 a3=7ffc68fba98c items=0 ppid=2972 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.362000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 24 00:45:29.365000 audit[3076]: NETFILTER_CFG table=nat:58 family=2 entries=1 op=nft_register_chain pid=3076 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.365000 audit[3076]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdf0b7b9d0 a2=0 a3=7ffdf0b7b9bc items=0 ppid=2972 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.365000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 24 00:45:29.366000 audit[3077]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3077 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.366000 audit[3077]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd97241800 a2=0 a3=7ffd972417ec items=0 ppid=2972 pid=3077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.366000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 24 00:45:29.455000 audit[3078]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3078 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.455000 audit[3078]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=108 a0=3 a1=7fff99c43770 a2=0 a3=7fff99c4375c items=0 ppid=2972 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.455000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 24 00:45:29.464000 audit[3080]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3080 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.464000 audit[3080]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffcf2c487f0 a2=0 a3=7ffcf2c487dc items=0 ppid=2972 pid=3080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.464000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 24 00:45:29.473000 audit[3083]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3083 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.473000 audit[3083]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffd76350390 a2=0 a3=7ffd7635037c items=0 ppid=2972 pid=3083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.473000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 24 00:45:29.476000 audit[3084]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3084 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.476000 audit[3084]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd62c734b0 a2=0 a3=7ffd62c7349c items=0 ppid=2972 pid=3084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.476000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 24 00:45:29.483000 audit[3086]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3086 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.483000 audit[3086]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd9c290a30 a2=0 a3=7ffd9c290a1c items=0 ppid=2972 pid=3086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.483000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 24 00:45:29.487000 audit[3087]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3087 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.487000 audit[3087]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcd866c550 a2=0 a3=7ffcd866c53c items=0 ppid=2972 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.487000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 24 00:45:29.495000 audit[3089]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3089 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.495000 audit[3089]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc417c7cb0 a2=0 a3=7ffc417c7c9c items=0 ppid=2972 pid=3089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.495000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 24 00:45:29.505000 audit[3092]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3092 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.505000 audit[3092]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff96738070 a2=0 a3=7fff9673805c items=0 ppid=2972 pid=3092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.505000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 24 00:45:29.508000 audit[3093]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3093 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.508000 audit[3093]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff198fa700 a2=0 a3=7fff198fa6ec items=0 ppid=2972 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.508000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 24 00:45:29.515000 audit[3095]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3095 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.515000 audit[3095]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff42d97ae0 a2=0 a3=7fff42d97acc items=0 
ppid=2972 pid=3095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.515000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 24 00:45:29.518000 audit[3096]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3096 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.518000 audit[3096]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe735af340 a2=0 a3=7ffe735af32c items=0 ppid=2972 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.518000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 24 00:45:29.525000 audit[3098]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3098 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.525000 audit[3098]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc891f6b80 a2=0 a3=7ffc891f6b6c items=0 ppid=2972 pid=3098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.525000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 24 00:45:29.535000 audit[3101]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3101 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.535000 audit[3101]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc9668fd20 a2=0 a3=7ffc9668fd0c items=0 ppid=2972 pid=3101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.535000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 24 00:45:29.549000 audit[3104]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3104 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.549000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc655a6850 a2=0 a3=7ffc655a683c items=0 ppid=2972 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.549000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 24 00:45:29.552000 audit[3105]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3105 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.552000 audit[3105]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe52b4c960 a2=0 a3=7ffe52b4c94c items=0 ppid=2972 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.552000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 24 00:45:29.558000 audit[3107]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3107 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.558000 audit[3107]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffd373bee10 a2=0 a3=7ffd373bedfc items=0 ppid=2972 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.558000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 24 00:45:29.568000 audit[3110]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3110 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.568000 audit[3110]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc4f9c3980 a2=0 a3=7ffc4f9c396c items=0 ppid=2972 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.568000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 24 00:45:29.571000 audit[3111]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.571000 audit[3111]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe70260880 a2=0 a3=7ffe7026086c items=0 ppid=2972 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.571000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 24 00:45:29.578000 audit[3113]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 24 00:45:29.578000 audit[3113]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffeb389f570 a2=0 a3=7ffeb389f55c items=0 ppid=2972 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.578000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 24 00:45:29.621000 audit[3119]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3119 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:29.621000 audit[3119]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd069d2270 a2=0 a3=7ffd069d225c items=0 ppid=2972 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.621000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:29.634000 audit[3119]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3119 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:29.634000 audit[3119]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffd069d2270 a2=0 a3=7ffd069d225c items=0 ppid=2972 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.634000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:29.639000 audit[3124]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.639000 audit[3124]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffce07def40 a2=0 a3=7ffce07def2c items=0 ppid=2972 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.639000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 24 00:45:29.647000 audit[3126]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.647000 audit[3126]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffe85338240 a2=0 a3=7ffe8533822c items=0 ppid=2972 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.647000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 24 00:45:29.661000 audit[3129]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3129 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.661000 audit[3129]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=752 a0=3 a1=7ffcffde11b0 a2=0 a3=7ffcffde119c items=0 ppid=2972 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.661000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 24 00:45:29.666000 audit[3130]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.666000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe18224b00 a2=0 a3=7ffe18224aec items=0 ppid=2972 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.666000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 24 00:45:29.674000 audit[3132]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3132 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.674000 audit[3132]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd0a8f8aa0 a2=0 a3=7ffd0a8f8a8c items=0 ppid=2972 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.674000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 24 00:45:29.679000 audit[3133]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3133 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.679000 audit[3133]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff74b43bf0 a2=0 a3=7fff74b43bdc items=0 ppid=2972 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.679000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 24 00:45:29.689000 audit[3135]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.689000 audit[3135]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffdde164ac0 a2=0 a3=7ffdde164aac items=0 ppid=2972 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.689000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 24 00:45:29.699000 audit[3138]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.699000 audit[3138]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7fffcef18fb0 a2=0 a3=7fffcef18f9c items=0 ppid=2972 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.699000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 24 00:45:29.702000 audit[3139]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3139 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.702000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff654b8120 a2=0 a3=7fff654b810c items=0 ppid=2972 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.702000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 24 00:45:29.708000 audit[3141]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3141 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.708000 audit[3141]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcb802ab50 a2=0 a3=7ffcb802ab3c items=0 ppid=2972 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.708000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 24 00:45:29.713000 audit[3142]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3142 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.713000 audit[3142]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffa77b1650 a2=0 a3=7fffa77b163c items=0 ppid=2972 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.713000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 24 00:45:29.721000 audit[3144]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3144 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.721000 audit[3144]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc91d0ee00 a2=0 a3=7ffc91d0edec 
items=0 ppid=2972 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.721000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 24 00:45:29.731000 audit[3147]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3147 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.731000 audit[3147]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd192c4670 a2=0 a3=7ffd192c465c items=0 ppid=2972 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.731000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 24 00:45:29.741000 audit[3150]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3150 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.741000 audit[3150]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe9c3c47f0 a2=0 a3=7ffe9c3c47dc items=0 ppid=2972 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.741000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 24 00:45:29.744000 audit[3151]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3151 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.744000 audit[3151]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd73c99060 a2=0 a3=7ffd73c9904c items=0 ppid=2972 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.744000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 24 00:45:29.750000 audit[3153]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3153 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.750000 audit[3153]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fff621a6710 a2=0 a3=7fff621a66fc items=0 ppid=2972 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.750000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 24 00:45:29.779000 audit[3156]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.779000 audit[3156]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd5758f490 a2=0 a3=7ffd5758f47c items=0 ppid=2972 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.779000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 24 00:45:29.782000 audit[3157]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.782000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffed1577d20 a2=0 a3=7ffed1577d0c items=0 ppid=2972 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.782000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 24 00:45:29.789000 audit[3159]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3159 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.789000 audit[3159]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffc6786de10 a2=0 a3=7ffc6786ddfc items=0 ppid=2972 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.789000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 24 00:45:29.792000 audit[3160]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3160 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.792000 audit[3160]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffbacd6a50 a2=0 a3=7fffbacd6a3c items=0 ppid=2972 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.792000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 24 00:45:29.798000 audit[3162]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3162 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.798000 audit[3162]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd76a33900 a2=0 a3=7ffd76a338ec items=0 ppid=2972 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.798000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 24 00:45:29.807000 audit[3165]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3165 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 24 00:45:29.807000 audit[3165]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffe9f76e0a0 a2=0 a3=7ffe9f76e08c items=0 ppid=2972 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.807000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 24 00:45:29.816000 audit[3167]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3167 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 24 00:45:29.816000 audit[3167]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffe5a657e00 a2=0 a3=7ffe5a657dec items=0 ppid=2972 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.816000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:29.817000 audit[3167]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3167 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 24 00:45:29.817000 audit[3167]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffe5a657e00 a2=0 a3=7ffe5a657dec items=0 ppid=2972 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:29.817000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:31.018656 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount362860582.mount: Deactivated successfully. 
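The PROCTITLE fields in the audit records above are hex-encoded because the captured command lines use NUL bytes as argument separators. A minimal decoding sketch (Python; the decode_proctitle helper is ours, and the sample value is copied from the iptables-restore records above):

# Decode an audit PROCTITLE value: hex-encoded argv with NUL separators.
def decode_proctitle(hex_value: str) -> str:
    raw = bytes.fromhex(hex_value)
    # Join the NUL-separated arguments with spaces for readability.
    return " ".join(part.decode("utf-8", errors="replace")
                    for part in raw.split(b"\x00") if part)

# Sample taken from the iptables-restore entries above.
sample = ("69707461626C65732D726573746F7265002D770035002D5700313030303030"
          "002D2D6E6F666C757368002D2D636F756E74657273")
print(decode_proctitle(sample))
# -> iptables-restore -w 5 -W 100000 --noflush --counters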
Jan 24 00:45:31.565755 containerd[1682]: time="2026-01-24T00:45:31.565698801Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:31.566952 containerd[1682]: time="2026-01-24T00:45:31.566838554Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 24 00:45:31.567807 containerd[1682]: time="2026-01-24T00:45:31.567783029Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:31.569550 containerd[1682]: time="2026-01-24T00:45:31.569524269Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:31.570157 containerd[1682]: time="2026-01-24T00:45:31.569999756Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.28524807s" Jan 24 00:45:31.570157 containerd[1682]: time="2026-01-24T00:45:31.570029905Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 24 00:45:31.572909 containerd[1682]: time="2026-01-24T00:45:31.572894099Z" level=info msg="CreateContainer within sandbox \"c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 24 00:45:31.583598 containerd[1682]: time="2026-01-24T00:45:31.583563648Z" level=info msg="Container fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:45:31.588607 containerd[1682]: time="2026-01-24T00:45:31.588565959Z" level=info msg="CreateContainer within sandbox \"c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce\"" Jan 24 00:45:31.589091 containerd[1682]: time="2026-01-24T00:45:31.589049466Z" level=info msg="StartContainer for \"fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce\"" Jan 24 00:45:31.589670 containerd[1682]: time="2026-01-24T00:45:31.589640603Z" level=info msg="connecting to shim fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce" address="unix:///run/containerd/s/4e60b15fb44aa56a176acf912469d108694efd9035acae36a1b93ca7161871ed" protocol=ttrpc version=3 Jan 24 00:45:31.606199 systemd[1]: Started cri-containerd-fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce.scope - libcontainer container fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce. 
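As a rough cross-check of the pull statistics logged above (bytes read=23558205 for quay.io/tigera/operator:v1.38.7, completed in 2.28524807s), the effective transfer rate works out to about 10 MB/s. A quick sketch, assuming the "bytes read" counter reflects what was actually transferred:

# Approximate pull throughput from the containerd fields in the log above.
bytes_read = 23_558_205      # "bytes read" reported while pulling tigera/operator:v1.38.7
pull_seconds = 2.28524807    # duration from the "Pulled image ... in 2.28524807s" message
rate = bytes_read / pull_seconds
print(f"{rate / 1e6:.1f} MB/s ({rate / 2**20:.1f} MiB/s)")
# -> roughly 10.3 MB/s (9.8 MiB/s)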
Jan 24 00:45:31.617000 audit: BPF prog-id=146 op=LOAD Jan 24 00:45:31.618000 audit: BPF prog-id=147 op=LOAD Jan 24 00:45:31.618000 audit[3176]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3000 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323034353661373661323231623563653234326630643734613962 Jan 24 00:45:31.618000 audit: BPF prog-id=147 op=UNLOAD Jan 24 00:45:31.618000 audit[3176]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323034353661373661323231623563653234326630643734613962 Jan 24 00:45:31.618000 audit: BPF prog-id=148 op=LOAD Jan 24 00:45:31.618000 audit[3176]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3000 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323034353661373661323231623563653234326630643734613962 Jan 24 00:45:31.618000 audit: BPF prog-id=149 op=LOAD Jan 24 00:45:31.618000 audit[3176]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3000 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323034353661373661323231623563653234326630643734613962 Jan 24 00:45:31.618000 audit: BPF prog-id=149 op=UNLOAD Jan 24 00:45:31.618000 audit[3176]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323034353661373661323231623563653234326630643734613962 Jan 24 00:45:31.618000 audit: BPF prog-id=148 op=UNLOAD Jan 24 00:45:31.618000 audit[3176]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323034353661373661323231623563653234326630643734613962 Jan 24 00:45:31.618000 audit: BPF prog-id=150 op=LOAD Jan 24 00:45:31.618000 audit[3176]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3000 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323034353661373661323231623563653234326630643734613962 Jan 24 00:45:31.636456 containerd[1682]: time="2026-01-24T00:45:31.636419843Z" level=info msg="StartContainer for \"fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce\" returns successfully" Jan 24 00:45:31.675011 kubelet[2863]: I0124 00:45:31.674963 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mqg5k" podStartSLOduration=3.67494919 podStartE2EDuration="3.67494919s" podCreationTimestamp="2026-01-24 00:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-24 00:45:29.679611278 +0000 UTC m=+6.211598135" watchObservedRunningTime="2026-01-24 00:45:31.67494919 +0000 UTC m=+8.206936017" Jan 24 00:45:32.352825 kubelet[2863]: I0124 00:45:32.352349 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-w5567" podStartSLOduration=2.063630619 podStartE2EDuration="4.352322267s" podCreationTimestamp="2026-01-24 00:45:28 +0000 UTC" firstStartedPulling="2026-01-24 00:45:29.281977264 +0000 UTC m=+5.813964081" lastFinishedPulling="2026-01-24 00:45:31.570668912 +0000 UTC m=+8.102655729" observedRunningTime="2026-01-24 00:45:31.676147763 +0000 UTC m=+8.208134580" watchObservedRunningTime="2026-01-24 00:45:32.352322267 +0000 UTC m=+8.884309134" Jan 24 00:45:33.811170 update_engine[1659]: I20260124 00:45:33.811098 1659 update_attempter.cc:509] Updating boot flags... Jan 24 00:45:37.283413 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 24 00:45:37.283560 kernel: audit: type=1106 audit(1769215537.281:521): pid=1905 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:45:37.281000 audit[1905]: USER_END pid=1905 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 24 00:45:37.282748 sudo[1905]: pam_unix(sudo:session): session closed for user root Jan 24 00:45:37.281000 audit[1905]: CRED_DISP pid=1905 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:45:37.295157 kernel: audit: type=1104 audit(1769215537.281:522): pid=1905 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 24 00:45:37.410565 sshd[1904]: Connection closed by 4.153.228.146 port 50562 Jan 24 00:45:37.411097 sshd-session[1900]: pam_unix(sshd:session): session closed for user core Jan 24 00:45:37.412000 audit[1900]: USER_END pid=1900 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:45:37.421205 kernel: audit: type=1106 audit(1769215537.412:523): pid=1900 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:45:37.421261 systemd[1]: sshd@6-65.109.167.77:22-4.153.228.146:50562.service: Deactivated successfully. Jan 24 00:45:37.424348 systemd[1]: session-8.scope: Deactivated successfully. Jan 24 00:45:37.424604 systemd[1]: session-8.scope: Consumed 4.263s CPU time, 230.7M memory peak. Jan 24 00:45:37.429128 systemd-logind[1656]: Session 8 logged out. Waiting for processes to exit. Jan 24 00:45:37.412000 audit[1900]: CRED_DISP pid=1900 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:45:37.436178 kernel: audit: type=1104 audit(1769215537.412:524): pid=1900 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:45:37.437146 systemd-logind[1656]: Removed session 8. Jan 24 00:45:37.420000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-65.109.167.77:22-4.153.228.146:50562 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:45:37.445082 kernel: audit: type=1131 audit(1769215537.420:525): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-65.109.167.77:22-4.153.228.146:50562 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:45:38.018000 audit[3278]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:38.024174 kernel: audit: type=1325 audit(1769215538.018:526): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:38.018000 audit[3278]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc4357f690 a2=0 a3=7ffc4357f67c items=0 ppid=2972 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:38.033090 kernel: audit: type=1300 audit(1769215538.018:526): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc4357f690 a2=0 a3=7ffc4357f67c items=0 ppid=2972 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:38.018000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:38.041442 kernel: audit: type=1327 audit(1769215538.018:526): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:38.041499 kernel: audit: type=1325 audit(1769215538.024:527): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:38.024000 audit[3278]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:38.024000 audit[3278]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc4357f690 a2=0 a3=0 items=0 ppid=2972 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:38.050082 kernel: audit: type=1300 audit(1769215538.024:527): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc4357f690 a2=0 a3=0 items=0 ppid=2972 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:38.024000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:38.066000 audit[3280]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3280 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:38.066000 audit[3280]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fffde1884c0 a2=0 a3=7fffde1884ac items=0 ppid=2972 pid=3280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:38.066000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:38.069000 audit[3280]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3280 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:38.069000 audit[3280]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffde1884c0 a2=0 a3=0 items=0 ppid=2972 pid=3280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:38.069000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:39.581000 audit[3282]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3282 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:39.581000 audit[3282]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffe67c9f580 a2=0 a3=7ffe67c9f56c items=0 ppid=2972 pid=3282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:39.581000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:39.588000 audit[3282]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3282 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:39.588000 audit[3282]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe67c9f580 a2=0 a3=0 items=0 ppid=2972 pid=3282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:39.588000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:40.600000 audit[3284]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3284 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:40.600000 audit[3284]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff5cdb7c60 a2=0 a3=7fff5cdb7c4c items=0 ppid=2972 pid=3284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:40.600000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:40.605000 audit[3284]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3284 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:40.605000 audit[3284]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff5cdb7c60 a2=0 a3=0 items=0 ppid=2972 pid=3284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:40.605000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:41.507854 systemd[1]: Created slice kubepods-besteffort-pod6101c1a7_906f_473c_863f_99c7c8933e1f.slice - libcontainer container kubepods-besteffort-pod6101c1a7_906f_473c_863f_99c7c8933e1f.slice. 
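The NETFILTER_CFG records in this stretch show repeated iptables-restore syncs from the same parent process (ppid 2972) that installed the KUBE-* chains earlier, with the IPv4 filter-table rule count growing (15, 16, 17, 19 entries) while the nat table holds at 12. A small sketch for tallying those counts from a journal dump; the regex is written against the key=value layout shown in these records, and the file name is only an example:

import re

# Extract (table, family, entries) from NETFILTER_CFG audit records, e.g.
# "NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule ..."
pattern = re.compile(r"NETFILTER_CFG table=(\w+):\d+ family=(\d+) entries=(\d+)")

def tally(log_text: str):
    return [(m.group(1), int(m.group(2)), int(m.group(3)))
            for m in pattern.finditer(log_text)]

# Example: tally(open("boot.log").read()) yields tuples such as
# ("filter", 2, 15), ("nat", 2, 12), ("filter", 2, 16), ...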
Jan 24 00:45:41.525000 audit[3286]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:41.525000 audit[3286]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffd9c2d8db0 a2=0 a3=7ffd9c2d8d9c items=0 ppid=2972 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.525000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:41.537000 audit[3286]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:41.547284 kubelet[2863]: I0124 00:45:41.547163 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44d4c\" (UniqueName: \"kubernetes.io/projected/6101c1a7-906f-473c-863f-99c7c8933e1f-kube-api-access-44d4c\") pod \"calico-typha-5765fff744-8kdwh\" (UID: \"6101c1a7-906f-473c-863f-99c7c8933e1f\") " pod="calico-system/calico-typha-5765fff744-8kdwh" Jan 24 00:45:41.547284 kubelet[2863]: I0124 00:45:41.547205 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6101c1a7-906f-473c-863f-99c7c8933e1f-typha-certs\") pod \"calico-typha-5765fff744-8kdwh\" (UID: \"6101c1a7-906f-473c-863f-99c7c8933e1f\") " pod="calico-system/calico-typha-5765fff744-8kdwh" Jan 24 00:45:41.547284 kubelet[2863]: I0124 00:45:41.547223 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6101c1a7-906f-473c-863f-99c7c8933e1f-tigera-ca-bundle\") pod \"calico-typha-5765fff744-8kdwh\" (UID: \"6101c1a7-906f-473c-863f-99c7c8933e1f\") " pod="calico-system/calico-typha-5765fff744-8kdwh" Jan 24 00:45:41.537000 audit[3286]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd9c2d8db0 a2=0 a3=0 items=0 ppid=2972 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.537000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:41.697128 systemd[1]: Created slice kubepods-besteffort-pod95f5cc14_5eb2_467f_bbfa_968ce61fb23b.slice - libcontainer container kubepods-besteffort-pod95f5cc14_5eb2_467f_bbfa_968ce61fb23b.slice. 
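The two "Created slice" entries above follow the kubelet's systemd cgroup naming for BestEffort pods: the prefix kubepods-besteffort-pod, the pod UID with dashes mapped to underscores, and a .slice suffix. A sketch of that mapping as inferred from these two entries (the helper name is ours):

def besteffort_pod_slice(pod_uid: str) -> str:
    # Matches the names logged above, e.g. UID 95f5cc14-5eb2-467f-bbfa-968ce61fb23b
    # -> kubepods-besteffort-pod95f5cc14_5eb2_467f_bbfa_968ce61fb23b.slice
    return f"kubepods-besteffort-pod{pod_uid.replace('-', '_')}.slice"

print(besteffort_pod_slice("95f5cc14-5eb2-467f-bbfa-968ce61fb23b"))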
Jan 24 00:45:41.748805 kubelet[2863]: I0124 00:45:41.748715 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-lib-modules\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.748805 kubelet[2863]: I0124 00:45:41.748773 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-tigera-ca-bundle\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.748805 kubelet[2863]: I0124 00:45:41.748797 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-var-run-calico\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.748805 kubelet[2863]: I0124 00:45:41.748817 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-cni-bin-dir\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.749171 kubelet[2863]: I0124 00:45:41.748837 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-cni-log-dir\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.749171 kubelet[2863]: I0124 00:45:41.748855 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-xtables-lock\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.749171 kubelet[2863]: I0124 00:45:41.748873 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc777\" (UniqueName: \"kubernetes.io/projected/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-kube-api-access-bc777\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.749171 kubelet[2863]: I0124 00:45:41.748899 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-cni-net-dir\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.749171 kubelet[2863]: I0124 00:45:41.748923 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-var-lib-calico\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.749365 kubelet[2863]: I0124 00:45:41.748942 2863 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-flexvol-driver-host\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.749365 kubelet[2863]: I0124 00:45:41.748960 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-policysync\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.749365 kubelet[2863]: I0124 00:45:41.748986 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/95f5cc14-5eb2-467f-bbfa-968ce61fb23b-node-certs\") pod \"calico-node-5qjb2\" (UID: \"95f5cc14-5eb2-467f-bbfa-968ce61fb23b\") " pod="calico-system/calico-node-5qjb2" Jan 24 00:45:41.820717 containerd[1682]: time="2026-01-24T00:45:41.819417372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5765fff744-8kdwh,Uid:6101c1a7-906f-473c-863f-99c7c8933e1f,Namespace:calico-system,Attempt:0,}" Jan 24 00:45:41.853532 kubelet[2863]: E0124 00:45:41.852855 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.853532 kubelet[2863]: W0124 00:45:41.852895 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.853532 kubelet[2863]: E0124 00:45:41.852927 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.855044 kubelet[2863]: E0124 00:45:41.854913 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.855161 kubelet[2863]: W0124 00:45:41.854943 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.855161 kubelet[2863]: E0124 00:45:41.855123 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.857040 kubelet[2863]: E0124 00:45:41.856959 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.858764 containerd[1682]: time="2026-01-24T00:45:41.856952353Z" level=info msg="connecting to shim b7042f6baaf09302f1d7ae7217794267e477ef5d6c3e4348caed72c75bf9638b" address="unix:///run/containerd/s/5202fc3955b07712f0cdc6768612a15f3054195547f6a0f99509e2d0122c4b21" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:45:41.858871 kubelet[2863]: W0124 00:45:41.857352 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.858871 kubelet[2863]: E0124 00:45:41.858414 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.862618 kubelet[2863]: E0124 00:45:41.861466 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.862618 kubelet[2863]: W0124 00:45:41.861555 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.862618 kubelet[2863]: E0124 00:45:41.861630 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.865121 kubelet[2863]: E0124 00:45:41.864374 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.865121 kubelet[2863]: W0124 00:45:41.864404 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.865491 kubelet[2863]: E0124 00:45:41.865292 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.866656 kubelet[2863]: E0124 00:45:41.866568 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.866656 kubelet[2863]: W0124 00:45:41.866597 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.866656 kubelet[2863]: E0124 00:45:41.866622 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.869509 kubelet[2863]: E0124 00:45:41.868858 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.870099 kubelet[2863]: W0124 00:45:41.869719 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.871177 kubelet[2863]: E0124 00:45:41.869931 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.874300 kubelet[2863]: E0124 00:45:41.874272 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.875401 kubelet[2863]: W0124 00:45:41.875253 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.876197 kubelet[2863]: E0124 00:45:41.876161 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.878691 kubelet[2863]: E0124 00:45:41.878319 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.878691 kubelet[2863]: W0124 00:45:41.878349 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.878691 kubelet[2863]: E0124 00:45:41.878377 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.882701 kubelet[2863]: E0124 00:45:41.882551 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.882701 kubelet[2863]: W0124 00:45:41.882586 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.882701 kubelet[2863]: E0124 00:45:41.882612 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.887110 kubelet[2863]: E0124 00:45:41.885861 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.887110 kubelet[2863]: W0124 00:45:41.885884 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.887110 kubelet[2863]: E0124 00:45:41.885905 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.889799 kubelet[2863]: E0124 00:45:41.889539 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.889799 kubelet[2863]: W0124 00:45:41.889592 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.889799 kubelet[2863]: E0124 00:45:41.889609 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.890793 kubelet[2863]: E0124 00:45:41.890745 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.890793 kubelet[2863]: W0124 00:45:41.890762 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.890994 kubelet[2863]: E0124 00:45:41.890917 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.893087 kubelet[2863]: E0124 00:45:41.892880 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.893087 kubelet[2863]: W0124 00:45:41.893039 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.893875 kubelet[2863]: E0124 00:45:41.893057 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.896767 kubelet[2863]: E0124 00:45:41.895913 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.900446 kubelet[2863]: W0124 00:45:41.898935 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.900446 kubelet[2863]: E0124 00:45:41.898967 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.900633 kubelet[2863]: E0124 00:45:41.900613 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.900909 kubelet[2863]: W0124 00:45:41.900720 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.900909 kubelet[2863]: E0124 00:45:41.900776 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.903537 kubelet[2863]: E0124 00:45:41.903515 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.903653 kubelet[2863]: W0124 00:45:41.903638 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.903756 kubelet[2863]: E0124 00:45:41.903741 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.906277 kubelet[2863]: E0124 00:45:41.906164 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.907846 kubelet[2863]: W0124 00:45:41.907374 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.907846 kubelet[2863]: E0124 00:45:41.907644 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.909034 kubelet[2863]: E0124 00:45:41.908900 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.910468 kubelet[2863]: W0124 00:45:41.910086 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.910468 kubelet[2863]: E0124 00:45:41.910100 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.912230 kubelet[2863]: E0124 00:45:41.912218 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.913113 kubelet[2863]: W0124 00:45:41.913086 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.913199 kubelet[2863]: E0124 00:45:41.913187 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.915385 kubelet[2863]: E0124 00:45:41.915296 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.915385 kubelet[2863]: W0124 00:45:41.915306 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.915385 kubelet[2863]: E0124 00:45:41.915314 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.917111 kubelet[2863]: E0124 00:45:41.916184 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.917111 kubelet[2863]: W0124 00:45:41.916194 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.917111 kubelet[2863]: E0124 00:45:41.916201 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.917284 kubelet[2863]: E0124 00:45:41.917274 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.917411 kubelet[2863]: W0124 00:45:41.917332 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.917411 kubelet[2863]: E0124 00:45:41.917341 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.917740 kubelet[2863]: E0124 00:45:41.917674 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.917740 kubelet[2863]: W0124 00:45:41.917683 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.917740 kubelet[2863]: E0124 00:45:41.917691 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.918618 kubelet[2863]: E0124 00:45:41.918360 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.918618 kubelet[2863]: W0124 00:45:41.918370 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.918618 kubelet[2863]: E0124 00:45:41.918378 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.920257 kubelet[2863]: E0124 00:45:41.920130 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.920257 kubelet[2863]: W0124 00:45:41.920142 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.920257 kubelet[2863]: E0124 00:45:41.920150 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.921092 kubelet[2863]: E0124 00:45:41.920737 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.921092 kubelet[2863]: W0124 00:45:41.920747 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.921092 kubelet[2863]: E0124 00:45:41.920765 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.921414 kubelet[2863]: E0124 00:45:41.921363 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.921905 kubelet[2863]: W0124 00:45:41.921879 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.923759 kubelet[2863]: E0124 00:45:41.923129 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.924151 kubelet[2863]: E0124 00:45:41.924133 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.924501 kubelet[2863]: W0124 00:45:41.924484 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.924646 kubelet[2863]: E0124 00:45:41.924635 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.926365 kubelet[2863]: E0124 00:45:41.926325 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.927054 kubelet[2863]: W0124 00:45:41.927041 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.927133 kubelet[2863]: E0124 00:45:41.927124 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.928906 kubelet[2863]: E0124 00:45:41.928600 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.928906 kubelet[2863]: W0124 00:45:41.928735 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.928906 kubelet[2863]: E0124 00:45:41.928747 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.930139 kubelet[2863]: E0124 00:45:41.930128 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.930568 kubelet[2863]: W0124 00:45:41.930255 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.930568 kubelet[2863]: E0124 00:45:41.930383 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.934085 kubelet[2863]: E0124 00:45:41.933240 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.934085 kubelet[2863]: W0124 00:45:41.933271 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.934085 kubelet[2863]: E0124 00:45:41.933302 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.934594 kubelet[2863]: E0124 00:45:41.934345 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.934594 kubelet[2863]: W0124 00:45:41.934362 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.934594 kubelet[2863]: E0124 00:45:41.934378 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.935498 kubelet[2863]: E0124 00:45:41.935274 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.935498 kubelet[2863]: W0124 00:45:41.935283 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.935498 kubelet[2863]: E0124 00:45:41.935293 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.936298 kubelet[2863]: E0124 00:45:41.936266 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.936298 kubelet[2863]: W0124 00:45:41.936276 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.936298 kubelet[2863]: E0124 00:45:41.936285 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.937050 kubelet[2863]: E0124 00:45:41.936672 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.937147 kubelet[2863]: W0124 00:45:41.937116 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.937147 kubelet[2863]: E0124 00:45:41.937130 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:41.947481 kubelet[2863]: E0124 00:45:41.947322 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:45:41.954812 kubelet[2863]: I0124 00:45:41.954761 2863 status_manager.go:895] "Failed to get status for pod" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" pod="calico-system/csi-node-driver-njf24" err="pods \"csi-node-driver-njf24\" is forbidden: User \"system:node:ci-4593-0-0-9-1308b066bf\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4593-0-0-9-1308b066bf' and this object" Jan 24 00:45:41.956576 systemd[1]: Started cri-containerd-b7042f6baaf09302f1d7ae7217794267e477ef5d6c3e4348caed72c75bf9638b.scope - libcontainer container b7042f6baaf09302f1d7ae7217794267e477ef5d6c3e4348caed72c75bf9638b. Jan 24 00:45:41.969286 kubelet[2863]: E0124 00:45:41.969126 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:41.969286 kubelet[2863]: W0124 00:45:41.969140 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:41.969286 kubelet[2863]: E0124 00:45:41.969153 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:41.989000 audit: BPF prog-id=151 op=LOAD Jan 24 00:45:41.990000 audit: BPF prog-id=152 op=LOAD Jan 24 00:45:41.990000 audit[3322]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=3297 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237303432663662616166303933303266316437616537323137373934 Jan 24 00:45:41.990000 audit: BPF prog-id=152 op=UNLOAD Jan 24 00:45:41.990000 audit[3322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3297 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237303432663662616166303933303266316437616537323137373934 Jan 24 00:45:41.990000 audit: BPF prog-id=153 op=LOAD Jan 24 00:45:41.990000 audit[3322]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=3297 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237303432663662616166303933303266316437616537323137373934 Jan 24 00:45:41.990000 audit: BPF prog-id=154 op=LOAD Jan 24 00:45:41.990000 audit[3322]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=3297 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237303432663662616166303933303266316437616537323137373934 Jan 24 00:45:41.990000 audit: BPF prog-id=154 op=UNLOAD Jan 24 00:45:41.990000 audit[3322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3297 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237303432663662616166303933303266316437616537323137373934 Jan 24 00:45:41.990000 audit: BPF prog-id=153 op=UNLOAD Jan 24 
00:45:41.990000 audit[3322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3297 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237303432663662616166303933303266316437616537323137373934 Jan 24 00:45:41.990000 audit: BPF prog-id=155 op=LOAD Jan 24 00:45:41.990000 audit[3322]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=3297 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237303432663662616166303933303266316437616537323137373934 Jan 24 00:45:42.001193 containerd[1682]: time="2026-01-24T00:45:42.001166113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5qjb2,Uid:95f5cc14-5eb2-467f-bbfa-968ce61fb23b,Namespace:calico-system,Attempt:0,}" Jan 24 00:45:42.022717 kubelet[2863]: E0124 00:45:42.022552 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.022717 kubelet[2863]: W0124 00:45:42.022614 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.022717 kubelet[2863]: E0124 00:45:42.022632 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.023136 kubelet[2863]: E0124 00:45:42.023125 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.023247 kubelet[2863]: W0124 00:45:42.023192 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.023247 kubelet[2863]: E0124 00:45:42.023202 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.024279 kubelet[2863]: E0124 00:45:42.024243 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.024279 kubelet[2863]: W0124 00:45:42.024277 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.024353 kubelet[2863]: E0124 00:45:42.024295 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.024587 kubelet[2863]: E0124 00:45:42.024570 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.024609 kubelet[2863]: W0124 00:45:42.024588 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.024609 kubelet[2863]: E0124 00:45:42.024596 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.024909 kubelet[2863]: E0124 00:45:42.024890 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.024909 kubelet[2863]: W0124 00:45:42.024902 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.024909 kubelet[2863]: E0124 00:45:42.024909 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.027848 kubelet[2863]: E0124 00:45:42.027271 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.027848 kubelet[2863]: W0124 00:45:42.027283 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.027848 kubelet[2863]: E0124 00:45:42.027293 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.027848 kubelet[2863]: E0124 00:45:42.027467 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.027848 kubelet[2863]: W0124 00:45:42.027482 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.027848 kubelet[2863]: E0124 00:45:42.027489 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.027848 kubelet[2863]: E0124 00:45:42.027688 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.027848 kubelet[2863]: W0124 00:45:42.027694 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.027848 kubelet[2863]: E0124 00:45:42.027701 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.028097 kubelet[2863]: E0124 00:45:42.027889 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.028097 kubelet[2863]: W0124 00:45:42.027895 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.028097 kubelet[2863]: E0124 00:45:42.027903 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.028539 kubelet[2863]: E0124 00:45:42.028512 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.028539 kubelet[2863]: W0124 00:45:42.028525 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.028539 kubelet[2863]: E0124 00:45:42.028535 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.029305 kubelet[2863]: E0124 00:45:42.029217 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.029305 kubelet[2863]: W0124 00:45:42.029248 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.029305 kubelet[2863]: E0124 00:45:42.029258 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.029499 kubelet[2863]: E0124 00:45:42.029455 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.029499 kubelet[2863]: W0124 00:45:42.029466 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.029499 kubelet[2863]: E0124 00:45:42.029496 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.030390 kubelet[2863]: E0124 00:45:42.030360 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.030390 kubelet[2863]: W0124 00:45:42.030375 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.030390 kubelet[2863]: E0124 00:45:42.030384 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.030707 kubelet[2863]: E0124 00:45:42.030644 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.030707 kubelet[2863]: W0124 00:45:42.030655 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.030707 kubelet[2863]: E0124 00:45:42.030662 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.031660 kubelet[2863]: E0124 00:45:42.031519 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.031660 kubelet[2863]: W0124 00:45:42.031533 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.031660 kubelet[2863]: E0124 00:45:42.031541 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.031750 kubelet[2863]: E0124 00:45:42.031700 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.031750 kubelet[2863]: W0124 00:45:42.031706 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.031750 kubelet[2863]: E0124 00:45:42.031712 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.031911 kubelet[2863]: E0124 00:45:42.031863 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.031911 kubelet[2863]: W0124 00:45:42.031888 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.031911 kubelet[2863]: E0124 00:45:42.031894 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.032093 kubelet[2863]: E0124 00:45:42.032024 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.032093 kubelet[2863]: W0124 00:45:42.032029 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.032093 kubelet[2863]: E0124 00:45:42.032035 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.032226 kubelet[2863]: E0124 00:45:42.032210 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.032226 kubelet[2863]: W0124 00:45:42.032223 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.032226 kubelet[2863]: E0124 00:45:42.032229 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.033342 kubelet[2863]: E0124 00:45:42.033325 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.033342 kubelet[2863]: W0124 00:45:42.033339 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.033403 kubelet[2863]: E0124 00:45:42.033347 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.039861 containerd[1682]: time="2026-01-24T00:45:42.039776690Z" level=info msg="connecting to shim 006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e" address="unix:///run/containerd/s/8e909fb5a93aa971aa66fc790686e79852f192cb3166895f4fc8c6f8bd071b5f" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:45:42.044920 containerd[1682]: time="2026-01-24T00:45:42.044881249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5765fff744-8kdwh,Uid:6101c1a7-906f-473c-863f-99c7c8933e1f,Namespace:calico-system,Attempt:0,} returns sandbox id \"b7042f6baaf09302f1d7ae7217794267e477ef5d6c3e4348caed72c75bf9638b\"" Jan 24 00:45:42.046386 containerd[1682]: time="2026-01-24T00:45:42.046358475Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 24 00:45:42.053681 kubelet[2863]: E0124 00:45:42.053634 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.053681 kubelet[2863]: W0124 00:45:42.053651 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.053681 kubelet[2863]: E0124 00:45:42.053682 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.053766 kubelet[2863]: I0124 00:45:42.053704 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc-kubelet-dir\") pod \"csi-node-driver-njf24\" (UID: \"12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc\") " pod="calico-system/csi-node-driver-njf24" Jan 24 00:45:42.053958 kubelet[2863]: E0124 00:45:42.053934 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.053958 kubelet[2863]: W0124 00:45:42.053946 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.053958 kubelet[2863]: E0124 00:45:42.053954 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.054206 kubelet[2863]: I0124 00:45:42.053973 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8rv\" (UniqueName: \"kubernetes.io/projected/12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc-kube-api-access-fs8rv\") pod \"csi-node-driver-njf24\" (UID: \"12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc\") " pod="calico-system/csi-node-driver-njf24" Jan 24 00:45:42.054332 kubelet[2863]: E0124 00:45:42.054315 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.054332 kubelet[2863]: W0124 00:45:42.054325 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.054437 kubelet[2863]: E0124 00:45:42.054333 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.054574 kubelet[2863]: E0124 00:45:42.054559 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.054574 kubelet[2863]: W0124 00:45:42.054570 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.054705 kubelet[2863]: E0124 00:45:42.054577 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.054853 kubelet[2863]: E0124 00:45:42.054797 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.054853 kubelet[2863]: W0124 00:45:42.054817 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.054853 kubelet[2863]: E0124 00:45:42.054826 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.054853 kubelet[2863]: I0124 00:45:42.054844 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc-varrun\") pod \"csi-node-driver-njf24\" (UID: \"12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc\") " pod="calico-system/csi-node-driver-njf24" Jan 24 00:45:42.055076 kubelet[2863]: E0124 00:45:42.055039 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.055076 kubelet[2863]: W0124 00:45:42.055048 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.055151 kubelet[2863]: E0124 00:45:42.055082 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.055278 kubelet[2863]: E0124 00:45:42.055252 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.055278 kubelet[2863]: W0124 00:45:42.055263 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.055278 kubelet[2863]: E0124 00:45:42.055269 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.055456 kubelet[2863]: E0124 00:45:42.055437 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.055456 kubelet[2863]: W0124 00:45:42.055446 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.055456 kubelet[2863]: E0124 00:45:42.055453 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.055678 kubelet[2863]: I0124 00:45:42.055466 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc-socket-dir\") pod \"csi-node-driver-njf24\" (UID: \"12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc\") " pod="calico-system/csi-node-driver-njf24" Jan 24 00:45:42.055704 kubelet[2863]: E0124 00:45:42.055697 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.055742 kubelet[2863]: W0124 00:45:42.055704 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.055742 kubelet[2863]: E0124 00:45:42.055711 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.055742 kubelet[2863]: I0124 00:45:42.055728 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc-registration-dir\") pod \"csi-node-driver-njf24\" (UID: \"12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc\") " pod="calico-system/csi-node-driver-njf24" Jan 24 00:45:42.055973 kubelet[2863]: E0124 00:45:42.055953 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.055973 kubelet[2863]: W0124 00:45:42.055966 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.055973 kubelet[2863]: E0124 00:45:42.055973 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.056741 kubelet[2863]: E0124 00:45:42.056212 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.056741 kubelet[2863]: W0124 00:45:42.056221 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.056741 kubelet[2863]: E0124 00:45:42.056227 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.056741 kubelet[2863]: E0124 00:45:42.056413 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.056741 kubelet[2863]: W0124 00:45:42.056420 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.056741 kubelet[2863]: E0124 00:45:42.056425 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.056741 kubelet[2863]: E0124 00:45:42.056630 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.056741 kubelet[2863]: W0124 00:45:42.056636 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.056741 kubelet[2863]: E0124 00:45:42.056643 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.056929 kubelet[2863]: E0124 00:45:42.056813 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.056929 kubelet[2863]: W0124 00:45:42.056819 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.056929 kubelet[2863]: E0124 00:45:42.056824 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.057001 kubelet[2863]: E0124 00:45:42.056976 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.057001 kubelet[2863]: W0124 00:45:42.056982 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.057001 kubelet[2863]: E0124 00:45:42.056987 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.072553 systemd[1]: Started cri-containerd-006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e.scope - libcontainer container 006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e. Jan 24 00:45:42.083000 audit: BPF prog-id=156 op=LOAD Jan 24 00:45:42.083000 audit: BPF prog-id=157 op=LOAD Jan 24 00:45:42.083000 audit[3423]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3412 pid=3423 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363732336666353232613165633839616134323437326539366132 Jan 24 00:45:42.083000 audit: BPF prog-id=157 op=UNLOAD Jan 24 00:45:42.083000 audit[3423]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3423 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363732336666353232613165633839616134323437326539366132 Jan 24 00:45:42.083000 audit: BPF prog-id=158 op=LOAD Jan 24 00:45:42.083000 audit[3423]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3412 pid=3423 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.083000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363732336666353232613165633839616134323437326539366132 Jan 24 00:45:42.083000 audit: BPF prog-id=159 op=LOAD Jan 24 00:45:42.083000 audit[3423]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3412 pid=3423 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363732336666353232613165633839616134323437326539366132 Jan 24 00:45:42.083000 audit: BPF prog-id=159 op=UNLOAD Jan 24 00:45:42.083000 audit[3423]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3423 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363732336666353232613165633839616134323437326539366132 Jan 24 00:45:42.083000 audit: BPF prog-id=158 op=UNLOAD Jan 24 00:45:42.083000 audit[3423]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3423 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363732336666353232613165633839616134323437326539366132 Jan 24 00:45:42.083000 audit: BPF prog-id=160 op=LOAD Jan 24 00:45:42.083000 audit[3423]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3412 pid=3423 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363732336666353232613165633839616134323437326539366132 Jan 24 00:45:42.099120 containerd[1682]: time="2026-01-24T00:45:42.099044088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5qjb2,Uid:95f5cc14-5eb2-467f-bbfa-968ce61fb23b,Namespace:calico-system,Attempt:0,} returns sandbox id \"006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e\"" Jan 24 00:45:42.157242 kubelet[2863]: E0124 00:45:42.157192 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.157331 kubelet[2863]: W0124 00:45:42.157230 2863 
driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.157331 kubelet[2863]: E0124 00:45:42.157307 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.158105 kubelet[2863]: E0124 00:45:42.158043 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.158141 kubelet[2863]: W0124 00:45:42.158130 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.158161 kubelet[2863]: E0124 00:45:42.158150 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.158835 kubelet[2863]: E0124 00:45:42.158807 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.158835 kubelet[2863]: W0124 00:45:42.158832 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.158901 kubelet[2863]: E0124 00:45:42.158850 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.159482 kubelet[2863]: E0124 00:45:42.159436 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.159482 kubelet[2863]: W0124 00:45:42.159461 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.159566 kubelet[2863]: E0124 00:45:42.159539 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.160432 kubelet[2863]: E0124 00:45:42.160359 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.160432 kubelet[2863]: W0124 00:45:42.160379 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.160432 kubelet[2863]: E0124 00:45:42.160395 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.161032 kubelet[2863]: E0124 00:45:42.160911 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.161032 kubelet[2863]: W0124 00:45:42.160931 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.161032 kubelet[2863]: E0124 00:45:42.160947 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.161364 kubelet[2863]: E0124 00:45:42.161354 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.161625 kubelet[2863]: W0124 00:45:42.161454 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.161625 kubelet[2863]: E0124 00:45:42.161466 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.162002 kubelet[2863]: E0124 00:45:42.161975 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.162002 kubelet[2863]: W0124 00:45:42.161984 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.162002 kubelet[2863]: E0124 00:45:42.161991 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.162814 kubelet[2863]: E0124 00:45:42.162782 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.162814 kubelet[2863]: W0124 00:45:42.162794 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.162814 kubelet[2863]: E0124 00:45:42.162802 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.163181 kubelet[2863]: E0124 00:45:42.162975 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.163181 kubelet[2863]: W0124 00:45:42.162982 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.163181 kubelet[2863]: E0124 00:45:42.162989 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.163975 kubelet[2863]: E0124 00:45:42.163221 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.163975 kubelet[2863]: W0124 00:45:42.163227 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.163975 kubelet[2863]: E0124 00:45:42.163234 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.163975 kubelet[2863]: E0124 00:45:42.163396 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.163975 kubelet[2863]: W0124 00:45:42.163402 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.163975 kubelet[2863]: E0124 00:45:42.163408 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.163975 kubelet[2863]: E0124 00:45:42.163570 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.163975 kubelet[2863]: W0124 00:45:42.163577 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.163975 kubelet[2863]: E0124 00:45:42.163583 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.163975 kubelet[2863]: E0124 00:45:42.163917 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.164409 kubelet[2863]: W0124 00:45:42.163925 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.164409 kubelet[2863]: E0124 00:45:42.163933 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.164409 kubelet[2863]: E0124 00:45:42.164201 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.164409 kubelet[2863]: W0124 00:45:42.164208 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.164409 kubelet[2863]: E0124 00:45:42.164215 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.164409 kubelet[2863]: E0124 00:45:42.164415 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.164634 kubelet[2863]: W0124 00:45:42.164421 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.164634 kubelet[2863]: E0124 00:45:42.164429 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.166090 kubelet[2863]: E0124 00:45:42.164776 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166090 kubelet[2863]: W0124 00:45:42.164787 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166090 kubelet[2863]: E0124 00:45:42.164794 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.166090 kubelet[2863]: E0124 00:45:42.165023 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166090 kubelet[2863]: W0124 00:45:42.165030 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166090 kubelet[2863]: E0124 00:45:42.165036 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.166090 kubelet[2863]: E0124 00:45:42.165210 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166090 kubelet[2863]: W0124 00:45:42.165216 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166090 kubelet[2863]: E0124 00:45:42.165222 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.166090 kubelet[2863]: E0124 00:45:42.165401 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166554 kubelet[2863]: W0124 00:45:42.165406 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166554 kubelet[2863]: E0124 00:45:42.165412 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.166554 kubelet[2863]: E0124 00:45:42.165589 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166554 kubelet[2863]: W0124 00:45:42.165595 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166554 kubelet[2863]: E0124 00:45:42.165601 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.166554 kubelet[2863]: E0124 00:45:42.165786 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166554 kubelet[2863]: W0124 00:45:42.165792 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166554 kubelet[2863]: E0124 00:45:42.165798 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.166554 kubelet[2863]: E0124 00:45:42.165980 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166554 kubelet[2863]: W0124 00:45:42.165986 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166919 kubelet[2863]: E0124 00:45:42.165992 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.166919 kubelet[2863]: E0124 00:45:42.166168 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166919 kubelet[2863]: W0124 00:45:42.166176 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166919 kubelet[2863]: E0124 00:45:42.166182 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.166919 kubelet[2863]: E0124 00:45:42.166357 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.166919 kubelet[2863]: W0124 00:45:42.166362 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.166919 kubelet[2863]: E0124 00:45:42.166370 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:42.174467 kubelet[2863]: E0124 00:45:42.174347 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:42.174467 kubelet[2863]: W0124 00:45:42.174394 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:42.174467 kubelet[2863]: E0124 00:45:42.174417 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:42.601415 kernel: kauditd_printk_skb: 69 callbacks suppressed Jan 24 00:45:42.601615 kernel: audit: type=1325 audit(1769215542.588:552): table=filter:115 family=2 entries=22 op=nft_register_rule pid=3492 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:42.588000 audit[3492]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3492 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:42.588000 audit[3492]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc56c0b0b0 a2=0 a3=7ffc56c0b09c items=0 ppid=2972 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.588000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:42.633875 kernel: audit: type=1300 audit(1769215542.588:552): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc56c0b0b0 a2=0 a3=7ffc56c0b09c items=0 ppid=2972 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.633938 kernel: audit: type=1327 audit(1769215542.588:552): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:42.610000 audit[3492]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3492 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:42.645429 kernel: audit: type=1325 audit(1769215542.610:553): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3492 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:45:42.664567 kernel: audit: type=1300 audit(1769215542.610:553): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc56c0b0b0 a2=0 a3=0 items=0 ppid=2972 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.610000 audit[3492]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc56c0b0b0 a2=0 a3=0 items=0 ppid=2972 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:42.610000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:42.671148 kernel: 
audit: type=1327 audit(1769215542.610:553): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:45:43.619111 kubelet[2863]: E0124 00:45:43.618926 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:45:43.858419 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3636187710.mount: Deactivated successfully. Jan 24 00:45:45.244267 containerd[1682]: time="2026-01-24T00:45:45.244217220Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:45.245207 containerd[1682]: time="2026-01-24T00:45:45.245186849Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 24 00:45:45.245978 containerd[1682]: time="2026-01-24T00:45:45.245947027Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:45.247348 containerd[1682]: time="2026-01-24T00:45:45.247327145Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:45.248102 containerd[1682]: time="2026-01-24T00:45:45.247724864Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 3.201338469s" Jan 24 00:45:45.248181 containerd[1682]: time="2026-01-24T00:45:45.248156033Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 24 00:45:45.249759 containerd[1682]: time="2026-01-24T00:45:45.249737310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 24 00:45:45.263421 containerd[1682]: time="2026-01-24T00:45:45.263392325Z" level=info msg="CreateContainer within sandbox \"b7042f6baaf09302f1d7ae7217794267e477ef5d6c3e4348caed72c75bf9638b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 24 00:45:45.272654 containerd[1682]: time="2026-01-24T00:45:45.270086823Z" level=info msg="Container 738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:45:45.277388 containerd[1682]: time="2026-01-24T00:45:45.277360890Z" level=info msg="CreateContainer within sandbox \"b7042f6baaf09302f1d7ae7217794267e477ef5d6c3e4348caed72c75bf9638b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28\"" Jan 24 00:45:45.278361 containerd[1682]: time="2026-01-24T00:45:45.277657269Z" level=info msg="StartContainer for \"738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28\"" Jan 24 00:45:45.278472 containerd[1682]: time="2026-01-24T00:45:45.278452457Z" level=info 
msg="connecting to shim 738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28" address="unix:///run/containerd/s/5202fc3955b07712f0cdc6768612a15f3054195547f6a0f99509e2d0122c4b21" protocol=ttrpc version=3 Jan 24 00:45:45.297176 systemd[1]: Started cri-containerd-738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28.scope - libcontainer container 738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28. Jan 24 00:45:45.310000 audit: BPF prog-id=161 op=LOAD Jan 24 00:45:45.315086 kernel: audit: type=1334 audit(1769215545.310:554): prog-id=161 op=LOAD Jan 24 00:45:45.315131 kernel: audit: type=1334 audit(1769215545.311:555): prog-id=162 op=LOAD Jan 24 00:45:45.311000 audit: BPF prog-id=162 op=LOAD Jan 24 00:45:45.311000 audit[3508]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=3297 pid=3508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:45.323329 kernel: audit: type=1300 audit(1769215545.311:555): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=3297 pid=3508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:45.330615 kernel: audit: type=1327 audit(1769215545.311:555): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733386432396564316538396133323462646235326664386535363139 Jan 24 00:45:45.311000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733386432396564316538396133323462646235326664386535363139 Jan 24 00:45:45.311000 audit: BPF prog-id=162 op=UNLOAD Jan 24 00:45:45.311000 audit[3508]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3297 pid=3508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:45.311000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733386432396564316538396133323462646235326664386535363139 Jan 24 00:45:45.311000 audit: BPF prog-id=163 op=LOAD Jan 24 00:45:45.311000 audit[3508]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=3297 pid=3508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:45.311000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733386432396564316538396133323462646235326664386535363139 Jan 24 00:45:45.311000 audit: BPF prog-id=164 op=LOAD Jan 24 00:45:45.311000 audit[3508]: SYSCALL arch=c000003e syscall=321 success=yes 
exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=3297 pid=3508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:45.311000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733386432396564316538396133323462646235326664386535363139 Jan 24 00:45:45.311000 audit: BPF prog-id=164 op=UNLOAD Jan 24 00:45:45.311000 audit[3508]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3297 pid=3508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:45.311000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733386432396564316538396133323462646235326664386535363139 Jan 24 00:45:45.311000 audit: BPF prog-id=163 op=UNLOAD Jan 24 00:45:45.311000 audit[3508]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3297 pid=3508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:45.311000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733386432396564316538396133323462646235326664386535363139 Jan 24 00:45:45.311000 audit: BPF prog-id=165 op=LOAD Jan 24 00:45:45.311000 audit[3508]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=3297 pid=3508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:45.311000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733386432396564316538396133323462646235326664386535363139 Jan 24 00:45:45.347461 containerd[1682]: time="2026-01-24T00:45:45.347348990Z" level=info msg="StartContainer for \"738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28\" returns successfully" Jan 24 00:45:45.620702 kubelet[2863]: E0124 00:45:45.620326 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:45:45.724125 kubelet[2863]: I0124 00:45:45.723910 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5765fff744-8kdwh" podStartSLOduration=1.520964121 podStartE2EDuration="4.723898217s" podCreationTimestamp="2026-01-24 00:45:41 +0000 UTC" firstStartedPulling="2026-01-24 00:45:42.045996056 +0000 UTC m=+18.577982883" 
lastFinishedPulling="2026-01-24 00:45:45.248930152 +0000 UTC m=+21.780916979" observedRunningTime="2026-01-24 00:45:45.722744769 +0000 UTC m=+22.254731586" watchObservedRunningTime="2026-01-24 00:45:45.723898217 +0000 UTC m=+22.255885044" Jan 24 00:45:45.757254 kubelet[2863]: E0124 00:45:45.757212 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.757254 kubelet[2863]: W0124 00:45:45.757251 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.757581 kubelet[2863]: E0124 00:45:45.757281 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.758430 kubelet[2863]: E0124 00:45:45.758408 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.758484 kubelet[2863]: W0124 00:45:45.758431 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.758484 kubelet[2863]: E0124 00:45:45.758453 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.758955 kubelet[2863]: E0124 00:45:45.758931 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.758955 kubelet[2863]: W0124 00:45:45.758951 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.759108 kubelet[2863]: E0124 00:45:45.758970 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.760144 kubelet[2863]: E0124 00:45:45.760116 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.760144 kubelet[2863]: W0124 00:45:45.760140 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.760726 kubelet[2863]: E0124 00:45:45.760161 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:45.760992 kubelet[2863]: E0124 00:45:45.760963 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.761245 kubelet[2863]: W0124 00:45:45.761124 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.761245 kubelet[2863]: E0124 00:45:45.761170 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.762199 kubelet[2863]: E0124 00:45:45.762140 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.762199 kubelet[2863]: W0124 00:45:45.762167 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.762199 kubelet[2863]: E0124 00:45:45.762185 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.762770 kubelet[2863]: E0124 00:45:45.762701 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.762770 kubelet[2863]: W0124 00:45:45.762729 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.762770 kubelet[2863]: E0124 00:45:45.762749 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.763743 kubelet[2863]: E0124 00:45:45.763706 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.763743 kubelet[2863]: W0124 00:45:45.763727 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.763974 kubelet[2863]: E0124 00:45:45.763816 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.764313 kubelet[2863]: E0124 00:45:45.764290 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.764313 kubelet[2863]: W0124 00:45:45.764309 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.764421 kubelet[2863]: E0124 00:45:45.764325 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:45.764789 kubelet[2863]: E0124 00:45:45.764754 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.764849 kubelet[2863]: W0124 00:45:45.764806 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.764849 kubelet[2863]: E0124 00:45:45.764820 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.765781 kubelet[2863]: E0124 00:45:45.765758 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.765781 kubelet[2863]: W0124 00:45:45.765777 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.766228 kubelet[2863]: E0124 00:45:45.765792 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.766228 kubelet[2863]: E0124 00:45:45.766212 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.766228 kubelet[2863]: W0124 00:45:45.766228 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.766336 kubelet[2863]: E0124 00:45:45.766246 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.766664 kubelet[2863]: E0124 00:45:45.766578 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.766664 kubelet[2863]: W0124 00:45:45.766593 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.766664 kubelet[2863]: E0124 00:45:45.766606 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.767001 kubelet[2863]: E0124 00:45:45.766926 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.767001 kubelet[2863]: W0124 00:45:45.766942 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.767001 kubelet[2863]: E0124 00:45:45.766957 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:45.767475 kubelet[2863]: E0124 00:45:45.767329 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.767475 kubelet[2863]: W0124 00:45:45.767351 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.767475 kubelet[2863]: E0124 00:45:45.767364 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.794265 kubelet[2863]: E0124 00:45:45.794202 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.794265 kubelet[2863]: W0124 00:45:45.794227 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.794265 kubelet[2863]: E0124 00:45:45.794246 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.795293 kubelet[2863]: E0124 00:45:45.795264 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.795293 kubelet[2863]: W0124 00:45:45.795285 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.795616 kubelet[2863]: E0124 00:45:45.795302 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.796547 kubelet[2863]: E0124 00:45:45.796513 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.796547 kubelet[2863]: W0124 00:45:45.796535 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.796547 kubelet[2863]: E0124 00:45:45.796551 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.796969 kubelet[2863]: E0124 00:45:45.796926 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.796969 kubelet[2863]: W0124 00:45:45.796943 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.796969 kubelet[2863]: E0124 00:45:45.796956 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:45.797627 kubelet[2863]: E0124 00:45:45.797607 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.797680 kubelet[2863]: W0124 00:45:45.797627 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.797680 kubelet[2863]: E0124 00:45:45.797641 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.798125 kubelet[2863]: E0124 00:45:45.798055 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.798125 kubelet[2863]: W0124 00:45:45.798107 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.798125 kubelet[2863]: E0124 00:45:45.798121 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.798557 kubelet[2863]: E0124 00:45:45.798458 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.798557 kubelet[2863]: W0124 00:45:45.798479 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.798557 kubelet[2863]: E0124 00:45:45.798508 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.799183 kubelet[2863]: E0124 00:45:45.799131 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.799183 kubelet[2863]: W0124 00:45:45.799175 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.799473 kubelet[2863]: E0124 00:45:45.799189 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.800041 kubelet[2863]: E0124 00:45:45.799990 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.800041 kubelet[2863]: W0124 00:45:45.800011 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.800041 kubelet[2863]: E0124 00:45:45.800024 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:45.801035 kubelet[2863]: E0124 00:45:45.801002 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.801035 kubelet[2863]: W0124 00:45:45.801026 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.801768 kubelet[2863]: E0124 00:45:45.801040 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.801768 kubelet[2863]: E0124 00:45:45.801406 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.801768 kubelet[2863]: W0124 00:45:45.801419 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.801768 kubelet[2863]: E0124 00:45:45.801432 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.801768 kubelet[2863]: E0124 00:45:45.801765 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.802116 kubelet[2863]: W0124 00:45:45.801777 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.802116 kubelet[2863]: E0124 00:45:45.801789 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.802955 kubelet[2863]: E0124 00:45:45.802170 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.802955 kubelet[2863]: W0124 00:45:45.802182 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.802955 kubelet[2863]: E0124 00:45:45.802194 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.802955 kubelet[2863]: E0124 00:45:45.802603 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.802955 kubelet[2863]: W0124 00:45:45.802615 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.802955 kubelet[2863]: E0124 00:45:45.802627 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:45.803293 kubelet[2863]: E0124 00:45:45.803219 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.803293 kubelet[2863]: W0124 00:45:45.803233 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.803293 kubelet[2863]: E0124 00:45:45.803246 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.803715 kubelet[2863]: E0124 00:45:45.803687 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.803715 kubelet[2863]: W0124 00:45:45.803708 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.803715 kubelet[2863]: E0124 00:45:45.803721 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.804302 kubelet[2863]: E0124 00:45:45.804149 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.804302 kubelet[2863]: W0124 00:45:45.804168 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.804302 kubelet[2863]: E0124 00:45:45.804180 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:45.804929 kubelet[2863]: E0124 00:45:45.804869 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:45.805145 kubelet[2863]: W0124 00:45:45.804938 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:45.805145 kubelet[2863]: E0124 00:45:45.804952 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.712518 kubelet[2863]: I0124 00:45:46.712449 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 24 00:45:46.774133 kubelet[2863]: E0124 00:45:46.774014 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.774133 kubelet[2863]: W0124 00:45:46.774038 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.774133 kubelet[2863]: E0124 00:45:46.774093 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:46.774479 kubelet[2863]: E0124 00:45:46.774439 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.774479 kubelet[2863]: W0124 00:45:46.774455 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.774479 kubelet[2863]: E0124 00:45:46.774468 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.774855 kubelet[2863]: E0124 00:45:46.774806 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.774855 kubelet[2863]: W0124 00:45:46.774828 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.774855 kubelet[2863]: E0124 00:45:46.774846 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.775219 kubelet[2863]: E0124 00:45:46.775171 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.775219 kubelet[2863]: W0124 00:45:46.775186 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.775219 kubelet[2863]: E0124 00:45:46.775198 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.775642 kubelet[2863]: E0124 00:45:46.775600 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.775642 kubelet[2863]: W0124 00:45:46.775629 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.775852 kubelet[2863]: E0124 00:45:46.775653 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.776182 kubelet[2863]: E0124 00:45:46.776145 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.776182 kubelet[2863]: W0124 00:45:46.776166 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.776182 kubelet[2863]: E0124 00:45:46.776180 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:46.776489 kubelet[2863]: E0124 00:45:46.776466 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.776489 kubelet[2863]: W0124 00:45:46.776480 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.776600 kubelet[2863]: E0124 00:45:46.776491 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.776944 kubelet[2863]: E0124 00:45:46.776854 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.776944 kubelet[2863]: W0124 00:45:46.776879 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.776944 kubelet[2863]: E0124 00:45:46.776898 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.777384 kubelet[2863]: E0124 00:45:46.777348 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.777384 kubelet[2863]: W0124 00:45:46.777367 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.777384 kubelet[2863]: E0124 00:45:46.777381 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.777831 kubelet[2863]: E0124 00:45:46.777780 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.777831 kubelet[2863]: W0124 00:45:46.777803 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.777831 kubelet[2863]: E0124 00:45:46.777820 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.778320 kubelet[2863]: E0124 00:45:46.778268 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.778320 kubelet[2863]: W0124 00:45:46.778287 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.778320 kubelet[2863]: E0124 00:45:46.778302 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:46.778718 kubelet[2863]: E0124 00:45:46.778668 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.778718 kubelet[2863]: W0124 00:45:46.778691 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.778718 kubelet[2863]: E0124 00:45:46.778708 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.779279 kubelet[2863]: E0124 00:45:46.779133 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.779279 kubelet[2863]: W0124 00:45:46.779150 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.779279 kubelet[2863]: E0124 00:45:46.779162 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.779587 kubelet[2863]: E0124 00:45:46.779572 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.779708 kubelet[2863]: W0124 00:45:46.779667 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.779708 kubelet[2863]: E0124 00:45:46.779685 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.779985 kubelet[2863]: E0124 00:45:46.779970 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.779985 kubelet[2863]: W0124 00:45:46.779983 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.780113 kubelet[2863]: E0124 00:45:46.779995 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.805821 kubelet[2863]: E0124 00:45:46.805742 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.805821 kubelet[2863]: W0124 00:45:46.805801 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.805821 kubelet[2863]: E0124 00:45:46.805821 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:46.806531 kubelet[2863]: E0124 00:45:46.806449 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.806612 kubelet[2863]: W0124 00:45:46.806542 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.806612 kubelet[2863]: E0124 00:45:46.806572 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.807171 kubelet[2863]: E0124 00:45:46.807142 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.807253 kubelet[2863]: W0124 00:45:46.807197 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.807253 kubelet[2863]: E0124 00:45:46.807210 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.808120 kubelet[2863]: E0124 00:45:46.807906 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.808120 kubelet[2863]: W0124 00:45:46.807940 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.808120 kubelet[2863]: E0124 00:45:46.807978 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.808683 kubelet[2863]: E0124 00:45:46.808646 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.808775 kubelet[2863]: W0124 00:45:46.808708 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.808775 kubelet[2863]: E0124 00:45:46.808723 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.809351 kubelet[2863]: E0124 00:45:46.809314 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.809351 kubelet[2863]: W0124 00:45:46.809333 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.809533 kubelet[2863]: E0124 00:45:46.809387 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:46.810287 kubelet[2863]: E0124 00:45:46.809894 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.810287 kubelet[2863]: W0124 00:45:46.809937 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.810287 kubelet[2863]: E0124 00:45:46.809949 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.810545 kubelet[2863]: E0124 00:45:46.810438 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.810545 kubelet[2863]: W0124 00:45:46.810450 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.810545 kubelet[2863]: E0124 00:45:46.810462 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.811034 kubelet[2863]: E0124 00:45:46.811002 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.811034 kubelet[2863]: W0124 00:45:46.811022 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.811034 kubelet[2863]: E0124 00:45:46.811034 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.812791 kubelet[2863]: E0124 00:45:46.811580 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.812791 kubelet[2863]: W0124 00:45:46.811642 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.812791 kubelet[2863]: E0124 00:45:46.811655 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.812791 kubelet[2863]: E0124 00:45:46.812124 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.812791 kubelet[2863]: W0124 00:45:46.812163 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.812791 kubelet[2863]: E0124 00:45:46.812176 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:46.812791 kubelet[2863]: E0124 00:45:46.812739 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.812791 kubelet[2863]: W0124 00:45:46.812751 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.812791 kubelet[2863]: E0124 00:45:46.812764 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.814254 kubelet[2863]: E0124 00:45:46.813486 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.814254 kubelet[2863]: W0124 00:45:46.813563 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.814254 kubelet[2863]: E0124 00:45:46.813575 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.814254 kubelet[2863]: E0124 00:45:46.814160 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.814254 kubelet[2863]: W0124 00:45:46.814173 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.814254 kubelet[2863]: E0124 00:45:46.814186 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.815254 kubelet[2863]: E0124 00:45:46.815098 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.815254 kubelet[2863]: W0124 00:45:46.815113 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.815254 kubelet[2863]: E0124 00:45:46.815127 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.815728 kubelet[2863]: E0124 00:45:46.815692 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.815728 kubelet[2863]: W0124 00:45:46.815714 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.815832 kubelet[2863]: E0124 00:45:46.815727 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 24 00:45:46.816883 kubelet[2863]: E0124 00:45:46.816407 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.816883 kubelet[2863]: W0124 00:45:46.816584 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.816883 kubelet[2863]: E0124 00:45:46.816605 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:46.817121 kubelet[2863]: E0124 00:45:46.817109 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 24 00:45:46.817176 kubelet[2863]: W0124 00:45:46.817135 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 24 00:45:46.817176 kubelet[2863]: E0124 00:45:46.817148 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 24 00:45:47.143715 containerd[1682]: time="2026-01-24T00:45:47.143631534Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:47.145338 containerd[1682]: time="2026-01-24T00:45:47.145082373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 24 00:45:47.146515 containerd[1682]: time="2026-01-24T00:45:47.146465510Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:47.149026 containerd[1682]: time="2026-01-24T00:45:47.148982166Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:47.149893 containerd[1682]: time="2026-01-24T00:45:47.149836514Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.900075324s" Jan 24 00:45:47.150001 containerd[1682]: time="2026-01-24T00:45:47.149981255Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 24 00:45:47.155998 containerd[1682]: time="2026-01-24T00:45:47.155959206Z" level=info msg="CreateContainer within sandbox \"006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 24 00:45:47.169508 containerd[1682]: time="2026-01-24T00:45:47.169225106Z" level=info msg="Container b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87: CDI 
devices from CRI Config.CDIDevices: []" Jan 24 00:45:47.178658 containerd[1682]: time="2026-01-24T00:45:47.178639711Z" level=info msg="CreateContainer within sandbox \"006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87\"" Jan 24 00:45:47.179533 containerd[1682]: time="2026-01-24T00:45:47.179421540Z" level=info msg="StartContainer for \"b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87\"" Jan 24 00:45:47.181532 containerd[1682]: time="2026-01-24T00:45:47.181465807Z" level=info msg="connecting to shim b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87" address="unix:///run/containerd/s/8e909fb5a93aa971aa66fc790686e79852f192cb3166895f4fc8c6f8bd071b5f" protocol=ttrpc version=3 Jan 24 00:45:47.208327 systemd[1]: Started cri-containerd-b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87.scope - libcontainer container b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87. Jan 24 00:45:47.276000 audit: BPF prog-id=166 op=LOAD Jan 24 00:45:47.276000 audit[3615]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=3412 pid=3615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:47.276000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239303036656334333134353533636535613737303861376263633935 Jan 24 00:45:47.276000 audit: BPF prog-id=167 op=LOAD Jan 24 00:45:47.276000 audit[3615]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=3412 pid=3615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:47.276000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239303036656334333134353533636535613737303861376263633935 Jan 24 00:45:47.276000 audit: BPF prog-id=167 op=UNLOAD Jan 24 00:45:47.276000 audit[3615]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:47.276000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239303036656334333134353533636535613737303861376263633935 Jan 24 00:45:47.276000 audit: BPF prog-id=166 op=UNLOAD Jan 24 00:45:47.276000 audit[3615]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:47.276000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239303036656334333134353533636535613737303861376263633935 Jan 24 00:45:47.276000 audit: BPF prog-id=168 op=LOAD Jan 24 00:45:47.276000 audit[3615]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=3412 pid=3615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:47.276000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239303036656334333134353533636535613737303861376263633935 Jan 24 00:45:47.316878 containerd[1682]: time="2026-01-24T00:45:47.316512361Z" level=info msg="StartContainer for \"b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87\" returns successfully" Jan 24 00:45:47.338832 systemd[1]: cri-containerd-b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87.scope: Deactivated successfully. Jan 24 00:45:47.341000 audit: BPF prog-id=168 op=UNLOAD Jan 24 00:45:47.343485 containerd[1682]: time="2026-01-24T00:45:47.343416771Z" level=info msg="received container exit event container_id:\"b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87\" id:\"b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87\" pid:3627 exited_at:{seconds:1769215547 nanos:342633432}" Jan 24 00:45:47.380307 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87-rootfs.mount: Deactivated successfully. 
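
The repeated "unexpected end of JSON input" records earlier in this log come from the kubelet probing a FlexVolume driver binary (nodeagent~uds/uds) that is not on disk yet: the exec fails, stdout is empty, and driver-call.go cannot parse empty output as JSON. The flexvol-driver container started above is what is expected to install that binary. As a minimal sketch only (not the real nodeagent~uds driver; the behaviour is an assumption based on the FlexVolume driver-call contract), a driver answering the "init" probe looks roughly like this:

```python
#!/usr/bin/env python3
# Illustrative sketch only -- not the real nodeagent~uds driver. It shows the
# contract behind the errors above: kubelet invokes the driver binary with
# "init" and parses stdout as JSON, so a missing binary (empty stdout) shows
# up as "unexpected end of JSON input" in driver-call.go.
import json
import sys

def main() -> int:
    command = sys.argv[1] if len(sys.argv) > 1 else ""
    if command == "init":
        # A FlexVolume driver must answer "init" with a JSON status object.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
        return 0
    # Anything this sketch does not implement is reported as unsupported.
    print(json.dumps({"status": "Not supported", "message": f"unhandled call: {command}"}))
    return 1

if __name__ == "__main__":
    sys.exit(main())
```

Once a working driver is present at the path shown in the log, the periodic plugin-probe errors should stop.
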
Jan 24 00:45:47.619451 kubelet[2863]: E0124 00:45:47.619356 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:45:47.721832 containerd[1682]: time="2026-01-24T00:45:47.721722537Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 24 00:45:49.616445 kubelet[2863]: E0124 00:45:49.615805 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:45:51.614663 kubelet[2863]: E0124 00:45:51.614622 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:45:51.871767 containerd[1682]: time="2026-01-24T00:45:51.871659315Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:51.872919 containerd[1682]: time="2026-01-24T00:45:51.872795724Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 24 00:45:51.873823 containerd[1682]: time="2026-01-24T00:45:51.873789153Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:51.875623 containerd[1682]: time="2026-01-24T00:45:51.875596981Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:45:51.876080 containerd[1682]: time="2026-01-24T00:45:51.876034781Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.154262184s" Jan 24 00:45:51.876144 containerd[1682]: time="2026-01-24T00:45:51.876131361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 24 00:45:51.880559 containerd[1682]: time="2026-01-24T00:45:51.880507466Z" level=info msg="CreateContainer within sandbox \"006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 24 00:45:51.891434 containerd[1682]: time="2026-01-24T00:45:51.891393386Z" level=info msg="Container bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:45:51.908017 containerd[1682]: time="2026-01-24T00:45:51.907981679Z" level=info msg="CreateContainer within sandbox 
\"006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4\"" Jan 24 00:45:51.908608 containerd[1682]: time="2026-01-24T00:45:51.908520789Z" level=info msg="StartContainer for \"bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4\"" Jan 24 00:45:51.909866 containerd[1682]: time="2026-01-24T00:45:51.909670618Z" level=info msg="connecting to shim bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4" address="unix:///run/containerd/s/8e909fb5a93aa971aa66fc790686e79852f192cb3166895f4fc8c6f8bd071b5f" protocol=ttrpc version=3 Jan 24 00:45:51.932210 systemd[1]: Started cri-containerd-bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4.scope - libcontainer container bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4. Jan 24 00:45:51.995000 audit: BPF prog-id=169 op=LOAD Jan 24 00:45:51.997755 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 24 00:45:51.997831 kernel: audit: type=1334 audit(1769215551.995:568): prog-id=169 op=LOAD Jan 24 00:45:51.995000 audit[3670]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3412 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:52.026200 kernel: audit: type=1300 audit(1769215551.995:568): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3412 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:52.026448 kernel: audit: type=1327 audit(1769215551.995:568): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262636130356332653636633165653031376635316362366234663764 Jan 24 00:45:51.995000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262636130356332653636633165653031376635316362366234663764 Jan 24 00:45:52.031503 kernel: audit: type=1334 audit(1769215551.995:569): prog-id=170 op=LOAD Jan 24 00:45:51.995000 audit: BPF prog-id=170 op=LOAD Jan 24 00:45:52.038049 kernel: audit: type=1300 audit(1769215551.995:569): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3412 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:51.995000 audit[3670]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3412 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:51.995000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262636130356332653636633165653031376635316362366234663764 Jan 24 00:45:52.046833 kernel: audit: type=1327 audit(1769215551.995:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262636130356332653636633165653031376635316362366234663764 Jan 24 00:45:52.046876 kernel: audit: type=1334 audit(1769215551.995:570): prog-id=170 op=UNLOAD Jan 24 00:45:51.995000 audit: BPF prog-id=170 op=UNLOAD Jan 24 00:45:52.053151 kernel: audit: type=1300 audit(1769215551.995:570): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:51.995000 audit[3670]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:52.059629 kernel: audit: type=1327 audit(1769215551.995:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262636130356332653636633165653031376635316362366234663764 Jan 24 00:45:51.995000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262636130356332653636633165653031376635316362366234663764 Jan 24 00:45:52.061973 kernel: audit: type=1334 audit(1769215551.995:571): prog-id=169 op=UNLOAD Jan 24 00:45:51.995000 audit: BPF prog-id=169 op=UNLOAD Jan 24 00:45:51.995000 audit[3670]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:51.995000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262636130356332653636633165653031376635316362366234663764 Jan 24 00:45:51.995000 audit: BPF prog-id=171 op=LOAD Jan 24 00:45:51.995000 audit[3670]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3412 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:45:51.995000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262636130356332653636633165653031376635316362366234663764 Jan 24 00:45:52.094084 
containerd[1682]: time="2026-01-24T00:45:52.093771987Z" level=info msg="StartContainer for \"bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4\" returns successfully" Jan 24 00:45:52.688836 containerd[1682]: time="2026-01-24T00:45:52.688797540Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 24 00:45:52.692268 systemd[1]: cri-containerd-bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4.scope: Deactivated successfully. Jan 24 00:45:52.693149 systemd[1]: cri-containerd-bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4.scope: Consumed 620ms CPU time, 194.3M memory peak, 171.3M written to disk. Jan 24 00:45:52.694335 containerd[1682]: time="2026-01-24T00:45:52.694249004Z" level=info msg="received container exit event container_id:\"bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4\" id:\"bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4\" pid:3683 exited_at:{seconds:1769215552 nanos:693835475}" Jan 24 00:45:52.694000 audit: BPF prog-id=171 op=UNLOAD Jan 24 00:45:52.723545 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4-rootfs.mount: Deactivated successfully. Jan 24 00:45:52.743824 kubelet[2863]: I0124 00:45:52.743767 2863 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 24 00:45:52.810434 systemd[1]: Created slice kubepods-besteffort-podd0cc0ca8_4b85_478b_b9c2_c61b42d93c89.slice - libcontainer container kubepods-besteffort-podd0cc0ca8_4b85_478b_b9c2_c61b42d93c89.slice. Jan 24 00:45:52.848463 systemd[1]: Created slice kubepods-besteffort-podd83d59e3_6296_40d9_bb63_5a69b654ac0c.slice - libcontainer container kubepods-besteffort-podd83d59e3_6296_40d9_bb63_5a69b654ac0c.slice. 
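
The audit records above (the BPF prog-id LOAD/UNLOAD events emitted while runc sets up each container) carry the invoking command line as a PROCTITLE field: the raw /proc/<pid>/cmdline contents, hex-encoded, with NUL bytes separating argv words. A small sketch for decoding such a value (decode_proctitle is our own helper, not part of any audit tooling):

```python
# Minimal sketch: turn an audit PROCTITLE hex value back into a readable
# command line. NUL bytes separate argv entries, so they become spaces here.
def decode_proctitle(hex_value: str) -> str:
    return bytes.fromhex(hex_value).replace(b"\x00", b" ").decode("utf-8", "replace")

# Prefix of one of the records above; this decodes to
# "runc --root /run/containerd/runc/k8s.io"
print(decode_proctitle(
    "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"
))
```
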
Jan 24 00:45:52.851022 kubelet[2863]: I0124 00:45:52.850733 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/d0cc0ca8-4b85-478b-b9c2-c61b42d93c89-goldmane-key-pair\") pod \"goldmane-666569f655-qhldj\" (UID: \"d0cc0ca8-4b85-478b-b9c2-c61b42d93c89\") " pod="calico-system/goldmane-666569f655-qhldj" Jan 24 00:45:52.851022 kubelet[2863]: I0124 00:45:52.850762 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7bh8\" (UniqueName: \"kubernetes.io/projected/d0cc0ca8-4b85-478b-b9c2-c61b42d93c89-kube-api-access-w7bh8\") pod \"goldmane-666569f655-qhldj\" (UID: \"d0cc0ca8-4b85-478b-b9c2-c61b42d93c89\") " pod="calico-system/goldmane-666569f655-qhldj" Jan 24 00:45:52.851022 kubelet[2863]: I0124 00:45:52.850780 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm5dq\" (UniqueName: \"kubernetes.io/projected/d83d59e3-6296-40d9-bb63-5a69b654ac0c-kube-api-access-tm5dq\") pod \"calico-apiserver-7779cd58b4-xp2rb\" (UID: \"d83d59e3-6296-40d9-bb63-5a69b654ac0c\") " pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" Jan 24 00:45:52.851022 kubelet[2863]: I0124 00:45:52.850792 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4461b4fb-8066-4a43-ad64-18be27337144-whisker-backend-key-pair\") pod \"whisker-6bfddbcdd-k5z9m\" (UID: \"4461b4fb-8066-4a43-ad64-18be27337144\") " pod="calico-system/whisker-6bfddbcdd-k5z9m" Jan 24 00:45:52.851022 kubelet[2863]: I0124 00:45:52.850804 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4461b4fb-8066-4a43-ad64-18be27337144-whisker-ca-bundle\") pod \"whisker-6bfddbcdd-k5z9m\" (UID: \"4461b4fb-8066-4a43-ad64-18be27337144\") " pod="calico-system/whisker-6bfddbcdd-k5z9m" Jan 24 00:45:52.851312 kubelet[2863]: I0124 00:45:52.850814 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ng2x\" (UniqueName: \"kubernetes.io/projected/4461b4fb-8066-4a43-ad64-18be27337144-kube-api-access-7ng2x\") pod \"whisker-6bfddbcdd-k5z9m\" (UID: \"4461b4fb-8066-4a43-ad64-18be27337144\") " pod="calico-system/whisker-6bfddbcdd-k5z9m" Jan 24 00:45:52.851312 kubelet[2863]: I0124 00:45:52.850826 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrvx\" (UniqueName: \"kubernetes.io/projected/7f18d0f4-a1ee-4594-a309-04f492279d0c-kube-api-access-hnrvx\") pod \"coredns-674b8bbfcf-x8d2s\" (UID: \"7f18d0f4-a1ee-4594-a309-04f492279d0c\") " pod="kube-system/coredns-674b8bbfcf-x8d2s" Jan 24 00:45:52.851312 kubelet[2863]: I0124 00:45:52.850837 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7fb26181-9fdc-4f96-be2c-85fbaa5f21b7-calico-apiserver-certs\") pod \"calico-apiserver-7779cd58b4-jsxfd\" (UID: \"7fb26181-9fdc-4f96-be2c-85fbaa5f21b7\") " pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" Jan 24 00:45:52.851312 kubelet[2863]: I0124 00:45:52.850856 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d0cc0ca8-4b85-478b-b9c2-c61b42d93c89-goldmane-ca-bundle\") pod \"goldmane-666569f655-qhldj\" (UID: \"d0cc0ca8-4b85-478b-b9c2-c61b42d93c89\") " pod="calico-system/goldmane-666569f655-qhldj" Jan 24 00:45:52.851312 kubelet[2863]: I0124 00:45:52.850884 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f18d0f4-a1ee-4594-a309-04f492279d0c-config-volume\") pod \"coredns-674b8bbfcf-x8d2s\" (UID: \"7f18d0f4-a1ee-4594-a309-04f492279d0c\") " pod="kube-system/coredns-674b8bbfcf-x8d2s" Jan 24 00:45:52.851504 kubelet[2863]: I0124 00:45:52.850896 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llmb2\" (UniqueName: \"kubernetes.io/projected/7fb26181-9fdc-4f96-be2c-85fbaa5f21b7-kube-api-access-llmb2\") pod \"calico-apiserver-7779cd58b4-jsxfd\" (UID: \"7fb26181-9fdc-4f96-be2c-85fbaa5f21b7\") " pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" Jan 24 00:45:52.851504 kubelet[2863]: I0124 00:45:52.850908 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0cc0ca8-4b85-478b-b9c2-c61b42d93c89-config\") pod \"goldmane-666569f655-qhldj\" (UID: \"d0cc0ca8-4b85-478b-b9c2-c61b42d93c89\") " pod="calico-system/goldmane-666569f655-qhldj" Jan 24 00:45:52.851504 kubelet[2863]: I0124 00:45:52.850921 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d83d59e3-6296-40d9-bb63-5a69b654ac0c-calico-apiserver-certs\") pod \"calico-apiserver-7779cd58b4-xp2rb\" (UID: \"d83d59e3-6296-40d9-bb63-5a69b654ac0c\") " pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" Jan 24 00:45:52.855802 systemd[1]: Created slice kubepods-besteffort-pod4461b4fb_8066_4a43_ad64_18be27337144.slice - libcontainer container kubepods-besteffort-pod4461b4fb_8066_4a43_ad64_18be27337144.slice. Jan 24 00:45:52.862919 systemd[1]: Created slice kubepods-besteffort-pod7fb26181_9fdc_4f96_be2c_85fbaa5f21b7.slice - libcontainer container kubepods-besteffort-pod7fb26181_9fdc_4f96_be2c_85fbaa5f21b7.slice. Jan 24 00:45:52.871252 systemd[1]: Created slice kubepods-burstable-pod7f18d0f4_a1ee_4594_a309_04f492279d0c.slice - libcontainer container kubepods-burstable-pod7f18d0f4_a1ee_4594_a309_04f492279d0c.slice. Jan 24 00:45:52.880234 systemd[1]: Created slice kubepods-besteffort-podf546e732_cf0b_44c7_9678_ae1cb31a23a4.slice - libcontainer container kubepods-besteffort-podf546e732_cf0b_44c7_9678_ae1cb31a23a4.slice. Jan 24 00:45:52.885228 systemd[1]: Created slice kubepods-burstable-pod0a9fe640_bc48_43ab_ab91_0adb30a0cc77.slice - libcontainer container kubepods-burstable-pod0a9fe640_bc48_43ab_ab91_0adb30a0cc77.slice. 
Jan 24 00:45:52.952217 kubelet[2863]: I0124 00:45:52.951711 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f546e732-cf0b-44c7-9678-ae1cb31a23a4-tigera-ca-bundle\") pod \"calico-kube-controllers-58fdcd774c-w2drb\" (UID: \"f546e732-cf0b-44c7-9678-ae1cb31a23a4\") " pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" Jan 24 00:45:52.952217 kubelet[2863]: I0124 00:45:52.951894 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv5q\" (UniqueName: \"kubernetes.io/projected/0a9fe640-bc48-43ab-ab91-0adb30a0cc77-kube-api-access-dwv5q\") pod \"coredns-674b8bbfcf-dfsrt\" (UID: \"0a9fe640-bc48-43ab-ab91-0adb30a0cc77\") " pod="kube-system/coredns-674b8bbfcf-dfsrt" Jan 24 00:45:52.954563 kubelet[2863]: I0124 00:45:52.954451 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a9fe640-bc48-43ab-ab91-0adb30a0cc77-config-volume\") pod \"coredns-674b8bbfcf-dfsrt\" (UID: \"0a9fe640-bc48-43ab-ab91-0adb30a0cc77\") " pod="kube-system/coredns-674b8bbfcf-dfsrt" Jan 24 00:45:52.954671 kubelet[2863]: I0124 00:45:52.954639 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5fsn\" (UniqueName: \"kubernetes.io/projected/f546e732-cf0b-44c7-9678-ae1cb31a23a4-kube-api-access-f5fsn\") pod \"calico-kube-controllers-58fdcd774c-w2drb\" (UID: \"f546e732-cf0b-44c7-9678-ae1cb31a23a4\") " pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" Jan 24 00:45:53.129369 containerd[1682]: time="2026-01-24T00:45:53.129170590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-qhldj,Uid:d0cc0ca8-4b85-478b-b9c2-c61b42d93c89,Namespace:calico-system,Attempt:0,}" Jan 24 00:45:53.153029 containerd[1682]: time="2026-01-24T00:45:53.152938961Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7779cd58b4-xp2rb,Uid:d83d59e3-6296-40d9-bb63-5a69b654ac0c,Namespace:calico-apiserver,Attempt:0,}" Jan 24 00:45:53.162052 containerd[1682]: time="2026-01-24T00:45:53.161959715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6bfddbcdd-k5z9m,Uid:4461b4fb-8066-4a43-ad64-18be27337144,Namespace:calico-system,Attempt:0,}" Jan 24 00:45:53.176202 containerd[1682]: time="2026-01-24T00:45:53.176087914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7779cd58b4-jsxfd,Uid:7fb26181-9fdc-4f96-be2c-85fbaa5f21b7,Namespace:calico-apiserver,Attempt:0,}" Jan 24 00:45:53.179756 containerd[1682]: time="2026-01-24T00:45:53.179714601Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-x8d2s,Uid:7f18d0f4-a1ee-4594-a309-04f492279d0c,Namespace:kube-system,Attempt:0,}" Jan 24 00:45:53.187346 containerd[1682]: time="2026-01-24T00:45:53.187221066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58fdcd774c-w2drb,Uid:f546e732-cf0b-44c7-9678-ae1cb31a23a4,Namespace:calico-system,Attempt:0,}" Jan 24 00:45:53.189779 containerd[1682]: time="2026-01-24T00:45:53.189592184Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dfsrt,Uid:0a9fe640-bc48-43ab-ab91-0adb30a0cc77,Namespace:kube-system,Attempt:0,}" Jan 24 00:45:53.304473 containerd[1682]: time="2026-01-24T00:45:53.304138686Z" level=error msg="Failed to destroy network for sandbox 
\"cb5ecb66a024ae14fec997fc2dcefa313eee5c41b3da9dc7e57358ca1aa8ce34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.308497 containerd[1682]: time="2026-01-24T00:45:53.308450223Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-qhldj,Uid:d0cc0ca8-4b85-478b-b9c2-c61b42d93c89,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb5ecb66a024ae14fec997fc2dcefa313eee5c41b3da9dc7e57358ca1aa8ce34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.309638 kubelet[2863]: E0124 00:45:53.308760 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb5ecb66a024ae14fec997fc2dcefa313eee5c41b3da9dc7e57358ca1aa8ce34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.309638 kubelet[2863]: E0124 00:45:53.308812 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb5ecb66a024ae14fec997fc2dcefa313eee5c41b3da9dc7e57358ca1aa8ce34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-qhldj" Jan 24 00:45:53.309638 kubelet[2863]: E0124 00:45:53.308830 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb5ecb66a024ae14fec997fc2dcefa313eee5c41b3da9dc7e57358ca1aa8ce34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-qhldj" Jan 24 00:45:53.309722 kubelet[2863]: E0124 00:45:53.308868 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cb5ecb66a024ae14fec997fc2dcefa313eee5c41b3da9dc7e57358ca1aa8ce34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:45:53.336325 containerd[1682]: time="2026-01-24T00:45:53.336262162Z" level=error msg="Failed to destroy network for sandbox \"44a0e41e94297dfcc7226851a2f2a695f7be58352e55a622aa614473a67489be\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.341642 containerd[1682]: time="2026-01-24T00:45:53.341100358Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6bfddbcdd-k5z9m,Uid:4461b4fb-8066-4a43-ad64-18be27337144,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"44a0e41e94297dfcc7226851a2f2a695f7be58352e55a622aa614473a67489be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.341775 kubelet[2863]: E0124 00:45:53.341295 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44a0e41e94297dfcc7226851a2f2a695f7be58352e55a622aa614473a67489be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.341775 kubelet[2863]: E0124 00:45:53.341342 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44a0e41e94297dfcc7226851a2f2a695f7be58352e55a622aa614473a67489be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6bfddbcdd-k5z9m" Jan 24 00:45:53.341775 kubelet[2863]: E0124 00:45:53.341359 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44a0e41e94297dfcc7226851a2f2a695f7be58352e55a622aa614473a67489be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6bfddbcdd-k5z9m" Jan 24 00:45:53.341855 kubelet[2863]: E0124 00:45:53.341399 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6bfddbcdd-k5z9m_calico-system(4461b4fb-8066-4a43-ad64-18be27337144)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6bfddbcdd-k5z9m_calico-system(4461b4fb-8066-4a43-ad64-18be27337144)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"44a0e41e94297dfcc7226851a2f2a695f7be58352e55a622aa614473a67489be\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6bfddbcdd-k5z9m" podUID="4461b4fb-8066-4a43-ad64-18be27337144" Jan 24 00:45:53.349424 containerd[1682]: time="2026-01-24T00:45:53.349330632Z" level=error msg="Failed to destroy network for sandbox \"217ee0234e19fc07d05afa9c9908ee63ef30f139df70bbde95759a1652d3301e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.351767 containerd[1682]: time="2026-01-24T00:45:53.351744350Z" level=error msg="Failed to destroy network for sandbox \"007b5024ebccd276dad51e53f3ad7643924c2820e106cf537033428b956b86d7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.353440 containerd[1682]: 
time="2026-01-24T00:45:53.353415738Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7779cd58b4-jsxfd,Uid:7fb26181-9fdc-4f96-be2c-85fbaa5f21b7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"217ee0234e19fc07d05afa9c9908ee63ef30f139df70bbde95759a1652d3301e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.353877 kubelet[2863]: E0124 00:45:53.353773 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"217ee0234e19fc07d05afa9c9908ee63ef30f139df70bbde95759a1652d3301e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.353877 kubelet[2863]: E0124 00:45:53.353818 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"217ee0234e19fc07d05afa9c9908ee63ef30f139df70bbde95759a1652d3301e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" Jan 24 00:45:53.353877 kubelet[2863]: E0124 00:45:53.353846 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"217ee0234e19fc07d05afa9c9908ee63ef30f139df70bbde95759a1652d3301e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" Jan 24 00:45:53.354201 kubelet[2863]: E0124 00:45:53.354047 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"217ee0234e19fc07d05afa9c9908ee63ef30f139df70bbde95759a1652d3301e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:45:53.355871 containerd[1682]: time="2026-01-24T00:45:53.355848147Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7779cd58b4-xp2rb,Uid:d83d59e3-6296-40d9-bb63-5a69b654ac0c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"007b5024ebccd276dad51e53f3ad7643924c2820e106cf537033428b956b86d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.356114 kubelet[2863]: E0124 00:45:53.356092 2863 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"007b5024ebccd276dad51e53f3ad7643924c2820e106cf537033428b956b86d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.356229 kubelet[2863]: E0124 00:45:53.356209 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"007b5024ebccd276dad51e53f3ad7643924c2820e106cf537033428b956b86d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" Jan 24 00:45:53.356360 kubelet[2863]: E0124 00:45:53.356274 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"007b5024ebccd276dad51e53f3ad7643924c2820e106cf537033428b956b86d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" Jan 24 00:45:53.356595 kubelet[2863]: E0124 00:45:53.356313 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"007b5024ebccd276dad51e53f3ad7643924c2820e106cf537033428b956b86d7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:45:53.360602 containerd[1682]: time="2026-01-24T00:45:53.360256623Z" level=error msg="Failed to destroy network for sandbox \"2c3827f361a92df946641a636b4ba12503a0149639fdf0953f0fad54e2cd65c3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.364422 containerd[1682]: time="2026-01-24T00:45:53.364388731Z" level=error msg="Failed to destroy network for sandbox \"cf614be9bdc970fd802626cd7c8008207780097ef3cf4b616a01dc6fae327e8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.365783 containerd[1682]: time="2026-01-24T00:45:53.365735709Z" level=error msg="Failed to destroy network for sandbox \"94b68cca43728c953742b0d8678d9d1e3711443258fe9ba2311daa1cb16f422b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.366564 containerd[1682]: time="2026-01-24T00:45:53.366458728Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-58fdcd774c-w2drb,Uid:f546e732-cf0b-44c7-9678-ae1cb31a23a4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf614be9bdc970fd802626cd7c8008207780097ef3cf4b616a01dc6fae327e8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.366704 kubelet[2863]: E0124 00:45:53.366669 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf614be9bdc970fd802626cd7c8008207780097ef3cf4b616a01dc6fae327e8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.366751 kubelet[2863]: E0124 00:45:53.366711 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf614be9bdc970fd802626cd7c8008207780097ef3cf4b616a01dc6fae327e8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" Jan 24 00:45:53.366751 kubelet[2863]: E0124 00:45:53.366736 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf614be9bdc970fd802626cd7c8008207780097ef3cf4b616a01dc6fae327e8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" Jan 24 00:45:53.366853 kubelet[2863]: E0124 00:45:53.366823 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cf614be9bdc970fd802626cd7c8008207780097ef3cf4b616a01dc6fae327e8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:45:53.367363 containerd[1682]: time="2026-01-24T00:45:53.367329188Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dfsrt,Uid:0a9fe640-bc48-43ab-ab91-0adb30a0cc77,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c3827f361a92df946641a636b4ba12503a0149639fdf0953f0fad54e2cd65c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.367902 kubelet[2863]: E0124 00:45:53.367867 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"2c3827f361a92df946641a636b4ba12503a0149639fdf0953f0fad54e2cd65c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.368031 kubelet[2863]: E0124 00:45:53.367902 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c3827f361a92df946641a636b4ba12503a0149639fdf0953f0fad54e2cd65c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dfsrt" Jan 24 00:45:53.368031 kubelet[2863]: E0124 00:45:53.367918 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c3827f361a92df946641a636b4ba12503a0149639fdf0953f0fad54e2cd65c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dfsrt" Jan 24 00:45:53.368031 kubelet[2863]: E0124 00:45:53.367952 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dfsrt_kube-system(0a9fe640-bc48-43ab-ab91-0adb30a0cc77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dfsrt_kube-system(0a9fe640-bc48-43ab-ab91-0adb30a0cc77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c3827f361a92df946641a636b4ba12503a0149639fdf0953f0fad54e2cd65c3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dfsrt" podUID="0a9fe640-bc48-43ab-ab91-0adb30a0cc77" Jan 24 00:45:53.369222 containerd[1682]: time="2026-01-24T00:45:53.369190487Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-x8d2s,Uid:7f18d0f4-a1ee-4594-a309-04f492279d0c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"94b68cca43728c953742b0d8678d9d1e3711443258fe9ba2311daa1cb16f422b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.369516 kubelet[2863]: E0124 00:45:53.369347 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94b68cca43728c953742b0d8678d9d1e3711443258fe9ba2311daa1cb16f422b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.369516 kubelet[2863]: E0124 00:45:53.369390 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94b68cca43728c953742b0d8678d9d1e3711443258fe9ba2311daa1cb16f422b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-x8d2s" Jan 24 
00:45:53.369516 kubelet[2863]: E0124 00:45:53.369406 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94b68cca43728c953742b0d8678d9d1e3711443258fe9ba2311daa1cb16f422b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-x8d2s" Jan 24 00:45:53.369588 kubelet[2863]: E0124 00:45:53.369448 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-x8d2s_kube-system(7f18d0f4-a1ee-4594-a309-04f492279d0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-x8d2s_kube-system(7f18d0f4-a1ee-4594-a309-04f492279d0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"94b68cca43728c953742b0d8678d9d1e3711443258fe9ba2311daa1cb16f422b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-x8d2s" podUID="7f18d0f4-a1ee-4594-a309-04f492279d0c" Jan 24 00:45:53.629931 systemd[1]: Created slice kubepods-besteffort-pod12f6abfd_9f5a_45d2_b23b_65ea3c59cfbc.slice - libcontainer container kubepods-besteffort-pod12f6abfd_9f5a_45d2_b23b_65ea3c59cfbc.slice. Jan 24 00:45:53.634741 containerd[1682]: time="2026-01-24T00:45:53.634676995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-njf24,Uid:12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc,Namespace:calico-system,Attempt:0,}" Jan 24 00:45:53.733477 containerd[1682]: time="2026-01-24T00:45:53.733187129Z" level=error msg="Failed to destroy network for sandbox \"445badda22e4c172c847f407e0c4a62d9f54eb8ec129553bfaf215f5bc3c20a7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.736983 containerd[1682]: time="2026-01-24T00:45:53.736870396Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-njf24,Uid:12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"445badda22e4c172c847f407e0c4a62d9f54eb8ec129553bfaf215f5bc3c20a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.738297 kubelet[2863]: E0124 00:45:53.738212 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"445badda22e4c172c847f407e0c4a62d9f54eb8ec129553bfaf215f5bc3c20a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 24 00:45:53.738467 kubelet[2863]: E0124 00:45:53.738291 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"445badda22e4c172c847f407e0c4a62d9f54eb8ec129553bfaf215f5bc3c20a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-njf24" Jan 24 00:45:53.738467 kubelet[2863]: E0124 00:45:53.738348 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"445badda22e4c172c847f407e0c4a62d9f54eb8ec129553bfaf215f5bc3c20a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-njf24" Jan 24 00:45:53.739185 kubelet[2863]: E0124 00:45:53.738654 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"445badda22e4c172c847f407e0c4a62d9f54eb8ec129553bfaf215f5bc3c20a7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:45:53.752041 containerd[1682]: time="2026-01-24T00:45:53.751737245Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 24 00:46:01.112768 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3027892319.mount: Deactivated successfully. Jan 24 00:46:01.143447 containerd[1682]: time="2026-01-24T00:46:01.143387945Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:46:01.144616 containerd[1682]: time="2026-01-24T00:46:01.144550085Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 24 00:46:01.147935 containerd[1682]: time="2026-01-24T00:46:01.147898864Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:46:01.152496 containerd[1682]: time="2026-01-24T00:46:01.150814814Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 24 00:46:01.152496 containerd[1682]: time="2026-01-24T00:46:01.152186433Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.400263319s" Jan 24 00:46:01.152496 containerd[1682]: time="2026-01-24T00:46:01.152206533Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 24 00:46:01.167605 containerd[1682]: time="2026-01-24T00:46:01.167551582Z" level=info msg="CreateContainer within sandbox \"006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 24 00:46:01.178110 
containerd[1682]: time="2026-01-24T00:46:01.176080571Z" level=info msg="Container 05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:46:01.184008 containerd[1682]: time="2026-01-24T00:46:01.183975450Z" level=info msg="CreateContainer within sandbox \"006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84\"" Jan 24 00:46:01.185576 containerd[1682]: time="2026-01-24T00:46:01.184567239Z" level=info msg="StartContainer for \"05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84\"" Jan 24 00:46:01.185576 containerd[1682]: time="2026-01-24T00:46:01.185495540Z" level=info msg="connecting to shim 05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84" address="unix:///run/containerd/s/8e909fb5a93aa971aa66fc790686e79852f192cb3166895f4fc8c6f8bd071b5f" protocol=ttrpc version=3 Jan 24 00:46:01.225197 systemd[1]: Started cri-containerd-05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84.scope - libcontainer container 05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84. Jan 24 00:46:01.281000 audit: BPF prog-id=172 op=LOAD Jan 24 00:46:01.284736 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 24 00:46:01.284792 kernel: audit: type=1334 audit(1769215561.281:574): prog-id=172 op=LOAD Jan 24 00:46:01.293726 kernel: audit: type=1300 audit(1769215561.281:574): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3412 pid=3944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:01.281000 audit[3944]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3412 pid=3944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:01.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035656437303065636135663534616562653735396561643065373532 Jan 24 00:46:01.324159 kernel: audit: type=1327 audit(1769215561.281:574): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035656437303065636135663534616562653735396561643065373532 Jan 24 00:46:01.324232 kernel: audit: type=1334 audit(1769215561.281:575): prog-id=173 op=LOAD Jan 24 00:46:01.281000 audit: BPF prog-id=173 op=LOAD Jan 24 00:46:01.339101 kernel: audit: type=1300 audit(1769215561.281:575): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3412 pid=3944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:01.281000 audit[3944]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3412 pid=3944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:01.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035656437303065636135663534616562653735396561643065373532 Jan 24 00:46:01.345001 kernel: audit: type=1327 audit(1769215561.281:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035656437303065636135663534616562653735396561643065373532 Jan 24 00:46:01.349209 containerd[1682]: time="2026-01-24T00:46:01.349168060Z" level=info msg="StartContainer for \"05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84\" returns successfully" Jan 24 00:46:01.281000 audit: BPF prog-id=173 op=UNLOAD Jan 24 00:46:01.351538 kernel: audit: type=1334 audit(1769215561.281:576): prog-id=173 op=UNLOAD Jan 24 00:46:01.281000 audit[3944]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:01.355653 kernel: audit: type=1300 audit(1769215561.281:576): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:01.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035656437303065636135663534616562653735396561643065373532 Jan 24 00:46:01.362983 kernel: audit: type=1327 audit(1769215561.281:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035656437303065636135663534616562653735396561643065373532 Jan 24 00:46:01.281000 audit: BPF prog-id=172 op=UNLOAD Jan 24 00:46:01.369162 kernel: audit: type=1334 audit(1769215561.281:577): prog-id=172 op=UNLOAD Jan 24 00:46:01.281000 audit[3944]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3412 pid=3944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:01.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035656437303065636135663534616562653735396561643065373532 Jan 24 00:46:01.281000 audit: BPF prog-id=174 op=LOAD Jan 24 00:46:01.281000 audit[3944]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3412 pid=3944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:01.281000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035656437303065636135663534616562653735396561643065373532 Jan 24 00:46:01.429540 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 24 00:46:01.429783 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 24 00:46:01.620622 kubelet[2863]: I0124 00:46:01.620445 2863 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4461b4fb-8066-4a43-ad64-18be27337144-whisker-ca-bundle\") pod \"4461b4fb-8066-4a43-ad64-18be27337144\" (UID: \"4461b4fb-8066-4a43-ad64-18be27337144\") " Jan 24 00:46:01.620622 kubelet[2863]: I0124 00:46:01.620490 2863 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ng2x\" (UniqueName: \"kubernetes.io/projected/4461b4fb-8066-4a43-ad64-18be27337144-kube-api-access-7ng2x\") pod \"4461b4fb-8066-4a43-ad64-18be27337144\" (UID: \"4461b4fb-8066-4a43-ad64-18be27337144\") " Jan 24 00:46:01.620622 kubelet[2863]: I0124 00:46:01.620513 2863 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4461b4fb-8066-4a43-ad64-18be27337144-whisker-backend-key-pair\") pod \"4461b4fb-8066-4a43-ad64-18be27337144\" (UID: \"4461b4fb-8066-4a43-ad64-18be27337144\") " Jan 24 00:46:01.621278 kubelet[2863]: I0124 00:46:01.621214 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4461b4fb-8066-4a43-ad64-18be27337144-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "4461b4fb-8066-4a43-ad64-18be27337144" (UID: "4461b4fb-8066-4a43-ad64-18be27337144"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 24 00:46:01.625634 kubelet[2863]: I0124 00:46:01.625602 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4461b4fb-8066-4a43-ad64-18be27337144-kube-api-access-7ng2x" (OuterVolumeSpecName: "kube-api-access-7ng2x") pod "4461b4fb-8066-4a43-ad64-18be27337144" (UID: "4461b4fb-8066-4a43-ad64-18be27337144"). InnerVolumeSpecName "kube-api-access-7ng2x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 24 00:46:01.625815 kubelet[2863]: I0124 00:46:01.625794 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4461b4fb-8066-4a43-ad64-18be27337144-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "4461b4fb-8066-4a43-ad64-18be27337144" (UID: "4461b4fb-8066-4a43-ad64-18be27337144"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 24 00:46:01.721175 kubelet[2863]: I0124 00:46:01.721129 2863 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4461b4fb-8066-4a43-ad64-18be27337144-whisker-backend-key-pair\") on node \"ci-4593-0-0-9-1308b066bf\" DevicePath \"\"" Jan 24 00:46:01.721175 kubelet[2863]: I0124 00:46:01.721175 2863 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4461b4fb-8066-4a43-ad64-18be27337144-whisker-ca-bundle\") on node \"ci-4593-0-0-9-1308b066bf\" DevicePath \"\"" Jan 24 00:46:01.721175 kubelet[2863]: I0124 00:46:01.721184 2863 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7ng2x\" (UniqueName: \"kubernetes.io/projected/4461b4fb-8066-4a43-ad64-18be27337144-kube-api-access-7ng2x\") on node \"ci-4593-0-0-9-1308b066bf\" DevicePath \"\"" Jan 24 00:46:01.795909 systemd[1]: Removed slice kubepods-besteffort-pod4461b4fb_8066_4a43_ad64_18be27337144.slice - libcontainer container kubepods-besteffort-pod4461b4fb_8066_4a43_ad64_18be27337144.slice. Jan 24 00:46:01.814493 kubelet[2863]: I0124 00:46:01.814242 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5qjb2" podStartSLOduration=1.761633196 podStartE2EDuration="20.814220075s" podCreationTimestamp="2026-01-24 00:45:41 +0000 UTC" firstStartedPulling="2026-01-24 00:45:42.100279765 +0000 UTC m=+18.632266592" lastFinishedPulling="2026-01-24 00:46:01.152866654 +0000 UTC m=+37.684853471" observedRunningTime="2026-01-24 00:46:01.809575995 +0000 UTC m=+38.341562862" watchObservedRunningTime="2026-01-24 00:46:01.814220075 +0000 UTC m=+38.346206932" Jan 24 00:46:01.893359 systemd[1]: Created slice kubepods-besteffort-poddbbc15c6_c8eb_4fd5_85b0_95f3e37249b5.slice - libcontainer container kubepods-besteffort-poddbbc15c6_c8eb_4fd5_85b0_95f3e37249b5.slice. Jan 24 00:46:01.922494 kubelet[2863]: I0124 00:46:01.922442 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlnt8\" (UniqueName: \"kubernetes.io/projected/dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5-kube-api-access-dlnt8\") pod \"whisker-58c54c478-wd6fd\" (UID: \"dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5\") " pod="calico-system/whisker-58c54c478-wd6fd" Jan 24 00:46:01.922494 kubelet[2863]: I0124 00:46:01.922494 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5-whisker-backend-key-pair\") pod \"whisker-58c54c478-wd6fd\" (UID: \"dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5\") " pod="calico-system/whisker-58c54c478-wd6fd" Jan 24 00:46:01.922653 kubelet[2863]: I0124 00:46:01.922510 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5-whisker-ca-bundle\") pod \"whisker-58c54c478-wd6fd\" (UID: \"dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5\") " pod="calico-system/whisker-58c54c478-wd6fd" Jan 24 00:46:02.115514 systemd[1]: var-lib-kubelet-pods-4461b4fb\x2d8066\x2d4a43\x2dad64\x2d18be27337144-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d7ng2x.mount: Deactivated successfully. 
Jan 24 00:46:02.115726 systemd[1]: var-lib-kubelet-pods-4461b4fb\x2d8066\x2d4a43\x2dad64\x2d18be27337144-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 24 00:46:02.201335 containerd[1682]: time="2026-01-24T00:46:02.201054610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58c54c478-wd6fd,Uid:dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5,Namespace:calico-system,Attempt:0,}" Jan 24 00:46:02.347269 systemd-networkd[1553]: cali05885af1c4e: Link UP Jan 24 00:46:02.348085 systemd-networkd[1553]: cali05885af1c4e: Gained carrier Jan 24 00:46:02.373793 containerd[1682]: 2026-01-24 00:46:02.229 [INFO][4034] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 24 00:46:02.373793 containerd[1682]: 2026-01-24 00:46:02.262 [INFO][4034] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0 whisker-58c54c478- calico-system dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5 911 0 2026-01-24 00:46:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:58c54c478 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4593-0-0-9-1308b066bf whisker-58c54c478-wd6fd eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali05885af1c4e [] [] }} ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Namespace="calico-system" Pod="whisker-58c54c478-wd6fd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-" Jan 24 00:46:02.373793 containerd[1682]: 2026-01-24 00:46:02.263 [INFO][4034] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Namespace="calico-system" Pod="whisker-58c54c478-wd6fd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" Jan 24 00:46:02.373793 containerd[1682]: 2026-01-24 00:46:02.284 [INFO][4045] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" HandleID="k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Workload="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.284 [INFO][4045] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" HandleID="k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Workload="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f200), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-9-1308b066bf", "pod":"whisker-58c54c478-wd6fd", "timestamp":"2026-01-24 00:46:02.284691415 +0000 UTC"}, Hostname:"ci-4593-0-0-9-1308b066bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.284 [INFO][4045] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.284 [INFO][4045] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.285 [INFO][4045] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-9-1308b066bf' Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.290 [INFO][4045] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.295 [INFO][4045] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.299 [INFO][4045] ipam/ipam.go 511: Trying affinity for 192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.301 [INFO][4045] ipam/ipam.go 158: Attempting to load block cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374292 containerd[1682]: 2026-01-24 00:46:02.305 [INFO][4045] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374429 containerd[1682]: 2026-01-24 00:46:02.305 [INFO][4045] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.51.64/26 handle="k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374429 containerd[1682]: 2026-01-24 00:46:02.307 [INFO][4045] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7 Jan 24 00:46:02.374429 containerd[1682]: 2026-01-24 00:46:02.315 [INFO][4045] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.51.64/26 handle="k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374429 containerd[1682]: 2026-01-24 00:46:02.323 [INFO][4045] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.51.65/26] block=192.168.51.64/26 handle="k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374429 containerd[1682]: 2026-01-24 00:46:02.323 [INFO][4045] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.51.65/26] handle="k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:02.374429 containerd[1682]: 2026-01-24 00:46:02.323 [INFO][4045] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
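The IPAM trace above is the normal allocation path: take the host-wide IPAM lock, confirm this host's affinity for the 192.168.51.64/26 block, claim the first free address in it (192.168.51.65), write the block back, and release the lock. A much-simplified sketch of the "first free address in an affine block" step, with the allocations held in a plain map purely for illustration (Calico keeps them in its datastore and also tracks handles and reservations, none of which is modeled here; treating 192.168.51.64 as already reserved is likewise an assumption made so the example lines up with the log):

    package main

    import (
    	"fmt"
    	"net/netip"
    )

    // nextFree walks the block in address order and returns the first address
    // that is not marked as allocated.
    func nextFree(block netip.Prefix, allocated map[netip.Addr]bool) (netip.Addr, bool) {
    	for a := block.Addr(); block.Contains(a); a = a.Next() {
    		if !allocated[a] {
    			return a, true
    		}
    	}
    	return netip.Addr{}, false
    }

    func main() {
    	block := netip.MustParsePrefix("192.168.51.64/26") // affine block from the trace
    	// Pretend the block's first address is already taken; the next claim is then
    	// 192.168.51.65, matching the address handed to the whisker pod above.
    	allocated := map[netip.Addr]bool{netip.MustParseAddr("192.168.51.64"): true}
    	if ip, ok := nextFree(block, allocated); ok {
    		fmt.Println("would claim", ip)
    	}
    }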
Jan 24 00:46:02.374429 containerd[1682]: 2026-01-24 00:46:02.323 [INFO][4045] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.51.65/26] IPv6=[] ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" HandleID="k8s-pod-network.da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Workload="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" Jan 24 00:46:02.374534 containerd[1682]: 2026-01-24 00:46:02.331 [INFO][4034] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Namespace="calico-system" Pod="whisker-58c54c478-wd6fd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0", GenerateName:"whisker-58c54c478-", Namespace:"calico-system", SelfLink:"", UID:"dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 46, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58c54c478", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"", Pod:"whisker-58c54c478-wd6fd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.51.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali05885af1c4e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:02.374534 containerd[1682]: 2026-01-24 00:46:02.332 [INFO][4034] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.51.65/32] ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Namespace="calico-system" Pod="whisker-58c54c478-wd6fd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" Jan 24 00:46:02.374584 containerd[1682]: 2026-01-24 00:46:02.332 [INFO][4034] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali05885af1c4e ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Namespace="calico-system" Pod="whisker-58c54c478-wd6fd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" Jan 24 00:46:02.374584 containerd[1682]: 2026-01-24 00:46:02.348 [INFO][4034] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Namespace="calico-system" Pod="whisker-58c54c478-wd6fd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" Jan 24 00:46:02.374634 containerd[1682]: 2026-01-24 00:46:02.350 [INFO][4034] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Namespace="calico-system" 
Pod="whisker-58c54c478-wd6fd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0", GenerateName:"whisker-58c54c478-", Namespace:"calico-system", SelfLink:"", UID:"dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 46, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58c54c478", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7", Pod:"whisker-58c54c478-wd6fd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.51.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali05885af1c4e", MAC:"9e:37:c9:9b:44:9a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:02.374713 containerd[1682]: 2026-01-24 00:46:02.365 [INFO][4034] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" Namespace="calico-system" Pod="whisker-58c54c478-wd6fd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-whisker--58c54c478--wd6fd-eth0" Jan 24 00:46:02.440342 containerd[1682]: time="2026-01-24T00:46:02.440281855Z" level=info msg="connecting to shim da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7" address="unix:///run/containerd/s/a564772419db7f8c30770f7bde9e690ec409ea565e576b8f20d753201928929f" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:46:02.483224 systemd[1]: Started cri-containerd-da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7.scope - libcontainer container da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7. 
Jan 24 00:46:02.501000 audit: BPF prog-id=175 op=LOAD Jan 24 00:46:02.501000 audit: BPF prog-id=176 op=LOAD Jan 24 00:46:02.501000 audit[4079]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4069 pid=4079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:02.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461353930396531356232626639373633323237346531346539353238 Jan 24 00:46:02.501000 audit: BPF prog-id=176 op=UNLOAD Jan 24 00:46:02.501000 audit[4079]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4069 pid=4079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:02.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461353930396531356232626639373633323237346531346539353238 Jan 24 00:46:02.501000 audit: BPF prog-id=177 op=LOAD Jan 24 00:46:02.501000 audit[4079]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4069 pid=4079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:02.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461353930396531356232626639373633323237346531346539353238 Jan 24 00:46:02.502000 audit: BPF prog-id=178 op=LOAD Jan 24 00:46:02.502000 audit[4079]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4069 pid=4079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:02.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461353930396531356232626639373633323237346531346539353238 Jan 24 00:46:02.502000 audit: BPF prog-id=178 op=UNLOAD Jan 24 00:46:02.502000 audit[4079]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4069 pid=4079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:02.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461353930396531356232626639373633323237346531346539353238 Jan 24 00:46:02.502000 audit: BPF prog-id=177 op=UNLOAD Jan 24 00:46:02.502000 audit[4079]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4069 pid=4079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:02.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461353930396531356232626639373633323237346531346539353238 Jan 24 00:46:02.502000 audit: BPF prog-id=179 op=LOAD Jan 24 00:46:02.502000 audit[4079]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4069 pid=4079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:02.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461353930396531356232626639373633323237346531346539353238 Jan 24 00:46:02.547880 containerd[1682]: time="2026-01-24T00:46:02.547814709Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58c54c478-wd6fd,Uid:dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5,Namespace:calico-system,Attempt:0,} returns sandbox id \"da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7\"" Jan 24 00:46:02.549708 containerd[1682]: time="2026-01-24T00:46:02.549666739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 24 00:46:02.976607 containerd[1682]: time="2026-01-24T00:46:02.976451153Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:02.980148 containerd[1682]: time="2026-01-24T00:46:02.980042283Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 24 00:46:02.980148 containerd[1682]: time="2026-01-24T00:46:02.980128443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:02.980468 kubelet[2863]: E0124 00:46:02.980399 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:46:02.980468 kubelet[2863]: E0124 00:46:02.980443 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:46:02.983441 kubelet[2863]: E0124 00:46:02.983384 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c13ad1f9d9cd499a81c814dc308eb491,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:02.985608 containerd[1682]: time="2026-01-24T00:46:02.985537333Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 24 00:46:03.423142 containerd[1682]: time="2026-01-24T00:46:03.422799640Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:03.424781 containerd[1682]: time="2026-01-24T00:46:03.424681170Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 24 00:46:03.424781 containerd[1682]: time="2026-01-24T00:46:03.424788240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:03.425269 kubelet[2863]: E0124 00:46:03.424986 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:46:03.425269 kubelet[2863]: E0124 00:46:03.425056 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:46:03.425508 kubelet[2863]: E0124 00:46:03.425309 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:03.427917 kubelet[2863]: E0124 00:46:03.427800 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:46:03.515688 systemd-networkd[1553]: cali05885af1c4e: Gained IPv6LL Jan 24 00:46:03.621321 kubelet[2863]: I0124 00:46:03.621214 2863 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4461b4fb-8066-4a43-ad64-18be27337144" path="/var/lib/kubelet/pods/4461b4fb-8066-4a43-ad64-18be27337144/volumes" Jan 24 00:46:03.789240 kubelet[2863]: E0124 00:46:03.789039 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack 
image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:46:03.827000 audit[4225]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=4225 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:03.827000 audit[4225]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fff06acf0b0 a2=0 a3=7fff06acf09c items=0 ppid=2972 pid=4225 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:03.827000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:03.831000 audit[4225]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=4225 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:03.831000 audit[4225]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff06acf0b0 a2=0 a3=0 items=0 ppid=2972 pid=4225 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:03.831000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:04.616297 containerd[1682]: time="2026-01-24T00:46:04.616204137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-x8d2s,Uid:7f18d0f4-a1ee-4594-a309-04f492279d0c,Namespace:kube-system,Attempt:0,}" Jan 24 00:46:04.764149 systemd-networkd[1553]: cali0ad53b444da: Link UP Jan 24 00:46:04.764723 systemd-networkd[1553]: cali0ad53b444da: Gained carrier Jan 24 00:46:04.777543 containerd[1682]: 2026-01-24 00:46:04.672 [INFO][4247] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 24 00:46:04.777543 containerd[1682]: 2026-01-24 00:46:04.694 [INFO][4247] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0 coredns-674b8bbfcf- kube-system 7f18d0f4-a1ee-4594-a309-04f492279d0c 840 0 2026-01-24 00:45:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-9-1308b066bf coredns-674b8bbfcf-x8d2s eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0ad53b444da [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Namespace="kube-system" Pod="coredns-674b8bbfcf-x8d2s" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-" Jan 24 00:46:04.777543 containerd[1682]: 2026-01-24 00:46:04.694 [INFO][4247] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Namespace="kube-system" Pod="coredns-674b8bbfcf-x8d2s" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" Jan 24 00:46:04.777543 containerd[1682]: 2026-01-24 00:46:04.719 [INFO][4260] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" HandleID="k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Workload="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.719 [INFO][4260] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" HandleID="k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Workload="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5800), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-9-1308b066bf", "pod":"coredns-674b8bbfcf-x8d2s", "timestamp":"2026-01-24 00:46:04.719343092 +0000 UTC"}, Hostname:"ci-4593-0-0-9-1308b066bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.719 [INFO][4260] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.719 [INFO][4260] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.719 [INFO][4260] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-9-1308b066bf' Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.729 [INFO][4260] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.735 [INFO][4260] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.739 [INFO][4260] ipam/ipam.go 511: Trying affinity for 192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.740 [INFO][4260] ipam/ipam.go 158: Attempting to load block cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.777926 containerd[1682]: 2026-01-24 00:46:04.742 [INFO][4260] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.778871 containerd[1682]: 2026-01-24 00:46:04.742 [INFO][4260] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.51.64/26 handle="k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.778871 containerd[1682]: 2026-01-24 00:46:04.744 [INFO][4260] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e Jan 24 00:46:04.778871 containerd[1682]: 2026-01-24 00:46:04.747 [INFO][4260] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.51.64/26 handle="k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.778871 containerd[1682]: 2026-01-24 00:46:04.753 [INFO][4260] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.51.66/26] block=192.168.51.64/26 handle="k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.778871 containerd[1682]: 2026-01-24 00:46:04.753 [INFO][4260] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.51.66/26] handle="k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:04.778871 containerd[1682]: 2026-01-24 00:46:04.753 [INFO][4260] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 24 00:46:04.778871 containerd[1682]: 2026-01-24 00:46:04.753 [INFO][4260] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.51.66/26] IPv6=[] ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" HandleID="k8s-pod-network.320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Workload="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" Jan 24 00:46:04.779484 containerd[1682]: 2026-01-24 00:46:04.757 [INFO][4247] cni-plugin/k8s.go 418: Populated endpoint ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Namespace="kube-system" Pod="coredns-674b8bbfcf-x8d2s" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7f18d0f4-a1ee-4594-a309-04f492279d0c", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"", Pod:"coredns-674b8bbfcf-x8d2s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.51.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0ad53b444da", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:04.779484 containerd[1682]: 2026-01-24 00:46:04.757 [INFO][4247] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.51.66/32] ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Namespace="kube-system" Pod="coredns-674b8bbfcf-x8d2s" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" Jan 24 00:46:04.779484 containerd[1682]: 2026-01-24 00:46:04.757 [INFO][4247] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0ad53b444da ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Namespace="kube-system" Pod="coredns-674b8bbfcf-x8d2s" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" Jan 24 00:46:04.779484 containerd[1682]: 2026-01-24 00:46:04.759 [INFO][4247] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-x8d2s" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" Jan 24 00:46:04.779484 containerd[1682]: 2026-01-24 00:46:04.760 [INFO][4247] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Namespace="kube-system" Pod="coredns-674b8bbfcf-x8d2s" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7f18d0f4-a1ee-4594-a309-04f492279d0c", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e", Pod:"coredns-674b8bbfcf-x8d2s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.51.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0ad53b444da", MAC:"76:57:e2:b8:cc:16", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:04.779484 containerd[1682]: 2026-01-24 00:46:04.769 [INFO][4247] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" Namespace="kube-system" Pod="coredns-674b8bbfcf-x8d2s" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--x8d2s-eth0" Jan 24 00:46:04.813907 containerd[1682]: time="2026-01-24T00:46:04.813809097Z" level=info msg="connecting to shim 320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e" address="unix:///run/containerd/s/b9e2ba818cd3ccd7d5fc0f8b96721e22c1dcc5398ed110cd987c48c8242e1843" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:46:04.841233 systemd[1]: Started cri-containerd-320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e.scope - libcontainer container 320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e. 
Jan 24 00:46:04.865000 audit: BPF prog-id=180 op=LOAD Jan 24 00:46:04.866000 audit: BPF prog-id=181 op=LOAD Jan 24 00:46:04.866000 audit[4293]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4282 pid=4293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.866000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332306631633036653663316361393738393661633936363764663065 Jan 24 00:46:04.866000 audit: BPF prog-id=181 op=UNLOAD Jan 24 00:46:04.866000 audit[4293]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4282 pid=4293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.866000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332306631633036653663316361393738393661633936363764663065 Jan 24 00:46:04.867000 audit: BPF prog-id=182 op=LOAD Jan 24 00:46:04.867000 audit[4293]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4282 pid=4293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.867000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332306631633036653663316361393738393661633936363764663065 Jan 24 00:46:04.867000 audit: BPF prog-id=183 op=LOAD Jan 24 00:46:04.867000 audit[4293]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4282 pid=4293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.867000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332306631633036653663316361393738393661633936363764663065 Jan 24 00:46:04.868000 audit: BPF prog-id=183 op=UNLOAD Jan 24 00:46:04.868000 audit[4293]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4282 pid=4293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.868000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332306631633036653663316361393738393661633936363764663065 Jan 24 00:46:04.868000 audit: BPF prog-id=182 op=UNLOAD Jan 24 00:46:04.868000 audit[4293]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4282 pid=4293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.868000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332306631633036653663316361393738393661633936363764663065 Jan 24 00:46:04.868000 audit: BPF prog-id=184 op=LOAD Jan 24 00:46:04.868000 audit[4293]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4282 pid=4293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.868000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332306631633036653663316361393738393661633936363764663065 Jan 24 00:46:04.908382 containerd[1682]: time="2026-01-24T00:46:04.908302991Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-x8d2s,Uid:7f18d0f4-a1ee-4594-a309-04f492279d0c,Namespace:kube-system,Attempt:0,} returns sandbox id \"320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e\"" Jan 24 00:46:04.915627 containerd[1682]: time="2026-01-24T00:46:04.915519892Z" level=info msg="CreateContainer within sandbox \"320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 24 00:46:04.935045 containerd[1682]: time="2026-01-24T00:46:04.932346173Z" level=info msg="Container 39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:46:04.933705 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3614678510.mount: Deactivated successfully. Jan 24 00:46:04.940311 containerd[1682]: time="2026-01-24T00:46:04.940253713Z" level=info msg="CreateContainer within sandbox \"320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312\"" Jan 24 00:46:04.942172 containerd[1682]: time="2026-01-24T00:46:04.942050013Z" level=info msg="StartContainer for \"39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312\"" Jan 24 00:46:04.942953 containerd[1682]: time="2026-01-24T00:46:04.942904273Z" level=info msg="connecting to shim 39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312" address="unix:///run/containerd/s/b9e2ba818cd3ccd7d5fc0f8b96721e22c1dcc5398ed110cd987c48c8242e1843" protocol=ttrpc version=3 Jan 24 00:46:04.960204 systemd[1]: Started cri-containerd-39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312.scope - libcontainer container 39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312. 
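The "connecting to shim ... namespace=k8s.io protocol=ttrpc" and "Started cri-containerd-39a57c4a...scope" entries above show the coredns container being managed by containerd under the CRI's k8s.io namespace. For orientation, a small Go sketch of how one could list those containers with the containerd Go client; the main socket path and the github.com/containerd/containerd (v1.x) module path are assumptions, since the log itself only shows the per-task shim sockets under /run/containerd/s/:

    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	"github.com/containerd/containerd"
    	"github.com/containerd/containerd/namespaces"
    )

    func main() {
    	// Assumed default containerd socket; not shown directly in this log.
    	client, err := containerd.New("/run/containerd/containerd.sock")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer client.Close()

    	// CRI-managed containers live in the k8s.io namespace, as the
    	// "namespace=k8s.io" fields above indicate.
    	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

    	containers, err := client.Containers(ctx)
    	if err != nil {
    		log.Fatal(err)
    	}
    	for _, c := range containers {
    		// e.g. 39a57c4af03d... for the coredns container started above.
    		fmt.Println(c.ID())
    	}
    }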
Jan 24 00:46:04.972000 audit: BPF prog-id=185 op=LOAD Jan 24 00:46:04.973000 audit: BPF prog-id=186 op=LOAD Jan 24 00:46:04.973000 audit[4323]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4282 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.973000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339613537633461663033646436393338333330376332623834343134 Jan 24 00:46:04.973000 audit: BPF prog-id=186 op=UNLOAD Jan 24 00:46:04.973000 audit[4323]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4282 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.973000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339613537633461663033646436393338333330376332623834343134 Jan 24 00:46:04.973000 audit: BPF prog-id=187 op=LOAD Jan 24 00:46:04.973000 audit[4323]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4282 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.973000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339613537633461663033646436393338333330376332623834343134 Jan 24 00:46:04.973000 audit: BPF prog-id=188 op=LOAD Jan 24 00:46:04.973000 audit[4323]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4282 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.973000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339613537633461663033646436393338333330376332623834343134 Jan 24 00:46:04.973000 audit: BPF prog-id=188 op=UNLOAD Jan 24 00:46:04.973000 audit[4323]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4282 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.973000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339613537633461663033646436393338333330376332623834343134 Jan 24 00:46:04.973000 audit: BPF prog-id=187 op=UNLOAD Jan 24 00:46:04.973000 audit[4323]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4282 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.973000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339613537633461663033646436393338333330376332623834343134 Jan 24 00:46:04.973000 audit: BPF prog-id=189 op=LOAD Jan 24 00:46:04.973000 audit[4323]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4282 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:04.973000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339613537633461663033646436393338333330376332623834343134 Jan 24 00:46:04.994427 containerd[1682]: time="2026-01-24T00:46:04.994305605Z" level=info msg="StartContainer for \"39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312\" returns successfully" Jan 24 00:46:05.617332 containerd[1682]: time="2026-01-24T00:46:05.616987966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58fdcd774c-w2drb,Uid:f546e732-cf0b-44c7-9678-ae1cb31a23a4,Namespace:calico-system,Attempt:0,}" Jan 24 00:46:05.619349 containerd[1682]: time="2026-01-24T00:46:05.619018836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7779cd58b4-xp2rb,Uid:d83d59e3-6296-40d9-bb63-5a69b654ac0c,Namespace:calico-apiserver,Attempt:0,}" Jan 24 00:46:05.820902 systemd-networkd[1553]: cali31e920bd568: Link UP Jan 24 00:46:05.839278 systemd-networkd[1553]: cali31e920bd568: Gained carrier Jan 24 00:46:05.849000 audit[4416]: NETFILTER_CFG table=filter:119 family=2 entries=22 op=nft_register_rule pid=4416 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:05.849000 audit[4416]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fffb20516d0 a2=0 a3=7fffb20516bc items=0 ppid=2972 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.849000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:05.850922 kubelet[2863]: I0124 00:46:05.850876 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-x8d2s" podStartSLOduration=37.850862169 podStartE2EDuration="37.850862169s" podCreationTimestamp="2026-01-24 00:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-24 00:46:05.834909146 +0000 UTC m=+42.366895963" watchObservedRunningTime="2026-01-24 00:46:05.850862169 +0000 UTC m=+42.382848996" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.690 [INFO][4375] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 24 00:46:05.858271 
containerd[1682]: 2026-01-24 00:46:05.710 [INFO][4375] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0 calico-kube-controllers-58fdcd774c- calico-system f546e732-cf0b-44c7-9678-ae1cb31a23a4 842 0 2026-01-24 00:45:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:58fdcd774c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4593-0-0-9-1308b066bf calico-kube-controllers-58fdcd774c-w2drb eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali31e920bd568 [] [] }} ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Namespace="calico-system" Pod="calico-kube-controllers-58fdcd774c-w2drb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.710 [INFO][4375] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Namespace="calico-system" Pod="calico-kube-controllers-58fdcd774c-w2drb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.755 [INFO][4398] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" HandleID="k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.755 [INFO][4398] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" HandleID="k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f0f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-9-1308b066bf", "pod":"calico-kube-controllers-58fdcd774c-w2drb", "timestamp":"2026-01-24 00:46:05.755331289 +0000 UTC"}, Hostname:"ci-4593-0-0-9-1308b066bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.755 [INFO][4398] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.755 [INFO][4398] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.755 [INFO][4398] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-9-1308b066bf' Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.764 [INFO][4398] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.769 [INFO][4398] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.775 [INFO][4398] ipam/ipam.go 511: Trying affinity for 192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.777 [INFO][4398] ipam/ipam.go 158: Attempting to load block cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.780 [INFO][4398] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.780 [INFO][4398] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.51.64/26 handle="k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.783 [INFO][4398] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03 Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.788 [INFO][4398] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.51.64/26 handle="k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.794 [INFO][4398] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.51.67/26] block=192.168.51.64/26 handle="k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.794 [INFO][4398] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.51.67/26] handle="k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.794 [INFO][4398] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 24 00:46:05.858271 containerd[1682]: 2026-01-24 00:46:05.794 [INFO][4398] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.51.67/26] IPv6=[] ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" HandleID="k8s-pod-network.57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" Jan 24 00:46:05.858876 containerd[1682]: 2026-01-24 00:46:05.808 [INFO][4375] cni-plugin/k8s.go 418: Populated endpoint ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Namespace="calico-system" Pod="calico-kube-controllers-58fdcd774c-w2drb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0", GenerateName:"calico-kube-controllers-58fdcd774c-", Namespace:"calico-system", SelfLink:"", UID:"f546e732-cf0b-44c7-9678-ae1cb31a23a4", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58fdcd774c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"", Pod:"calico-kube-controllers-58fdcd774c-w2drb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.51.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali31e920bd568", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:05.858876 containerd[1682]: 2026-01-24 00:46:05.808 [INFO][4375] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.51.67/32] ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Namespace="calico-system" Pod="calico-kube-controllers-58fdcd774c-w2drb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" Jan 24 00:46:05.858876 containerd[1682]: 2026-01-24 00:46:05.808 [INFO][4375] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali31e920bd568 ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Namespace="calico-system" Pod="calico-kube-controllers-58fdcd774c-w2drb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" Jan 24 00:46:05.858876 containerd[1682]: 2026-01-24 00:46:05.836 [INFO][4375] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Namespace="calico-system" Pod="calico-kube-controllers-58fdcd774c-w2drb" 
WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" Jan 24 00:46:05.858876 containerd[1682]: 2026-01-24 00:46:05.842 [INFO][4375] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Namespace="calico-system" Pod="calico-kube-controllers-58fdcd774c-w2drb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0", GenerateName:"calico-kube-controllers-58fdcd774c-", Namespace:"calico-system", SelfLink:"", UID:"f546e732-cf0b-44c7-9678-ae1cb31a23a4", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58fdcd774c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03", Pod:"calico-kube-controllers-58fdcd774c-w2drb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.51.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali31e920bd568", MAC:"ae:3e:b9:36:88:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:05.858876 containerd[1682]: 2026-01-24 00:46:05.856 [INFO][4375] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" Namespace="calico-system" Pod="calico-kube-controllers-58fdcd774c-w2drb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--kube--controllers--58fdcd774c--w2drb-eth0" Jan 24 00:46:05.861000 audit[4416]: NETFILTER_CFG table=nat:120 family=2 entries=12 op=nft_register_rule pid=4416 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:05.861000 audit[4416]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffb20516d0 a2=0 a3=0 items=0 ppid=2972 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.861000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:05.900586 containerd[1682]: time="2026-01-24T00:46:05.900477793Z" level=info msg="connecting to shim 57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03" address="unix:///run/containerd/s/1d57c9086f3407a58b4004e69f54fac0a168e4acbd9221bac108e8291528529f" namespace=k8s.io protocol=ttrpc version=3 Jan 24 
00:46:05.902000 audit[4426]: NETFILTER_CFG table=filter:121 family=2 entries=19 op=nft_register_rule pid=4426 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:05.902000 audit[4426]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffdea4090f0 a2=0 a3=7ffdea4090dc items=0 ppid=2972 pid=4426 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.902000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:05.907000 audit[4426]: NETFILTER_CFG table=nat:122 family=2 entries=33 op=nft_register_chain pid=4426 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:05.907000 audit[4426]: SYSCALL arch=c000003e syscall=46 success=yes exit=13428 a0=3 a1=7ffdea4090f0 a2=0 a3=7ffdea4090dc items=0 ppid=2972 pid=4426 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.907000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:05.913874 systemd-networkd[1553]: cali3f283f077c2: Link UP Jan 24 00:46:05.916237 systemd-networkd[1553]: cali3f283f077c2: Gained carrier Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.695 [INFO][4378] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.714 [INFO][4378] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0 calico-apiserver-7779cd58b4- calico-apiserver d83d59e3-6296-40d9-bb63-5a69b654ac0c 841 0 2026-01-24 00:45:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7779cd58b4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-9-1308b066bf calico-apiserver-7779cd58b4-xp2rb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3f283f077c2 [] [] }} ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-xp2rb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.714 [INFO][4378] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-xp2rb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.758 [INFO][4403] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" HandleID="k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" Jan 24 00:46:05.932030 containerd[1682]: 
2026-01-24 00:46:05.758 [INFO][4403] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" HandleID="k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-9-1308b066bf", "pod":"calico-apiserver-7779cd58b4-xp2rb", "timestamp":"2026-01-24 00:46:05.757994669 +0000 UTC"}, Hostname:"ci-4593-0-0-9-1308b066bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.758 [INFO][4403] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.794 [INFO][4403] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.795 [INFO][4403] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-9-1308b066bf' Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.864 [INFO][4403] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.870 [INFO][4403] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.878 [INFO][4403] ipam/ipam.go 511: Trying affinity for 192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.881 [INFO][4403] ipam/ipam.go 158: Attempting to load block cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.883 [INFO][4403] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.883 [INFO][4403] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.51.64/26 handle="k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.885 [INFO][4403] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147 Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.895 [INFO][4403] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.51.64/26 handle="k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.902 [INFO][4403] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.51.68/26] block=192.168.51.64/26 handle="k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.903 [INFO][4403] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.51.68/26] handle="k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" 
host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.903 [INFO][4403] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 24 00:46:05.932030 containerd[1682]: 2026-01-24 00:46:05.903 [INFO][4403] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.51.68/26] IPv6=[] ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" HandleID="k8s-pod-network.d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" Jan 24 00:46:05.932460 containerd[1682]: 2026-01-24 00:46:05.907 [INFO][4378] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-xp2rb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0", GenerateName:"calico-apiserver-7779cd58b4-", Namespace:"calico-apiserver", SelfLink:"", UID:"d83d59e3-6296-40d9-bb63-5a69b654ac0c", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7779cd58b4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"", Pod:"calico-apiserver-7779cd58b4-xp2rb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.51.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3f283f077c2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:05.932460 containerd[1682]: 2026-01-24 00:46:05.907 [INFO][4378] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.51.68/32] ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-xp2rb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" Jan 24 00:46:05.932460 containerd[1682]: 2026-01-24 00:46:05.907 [INFO][4378] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3f283f077c2 ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-xp2rb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" Jan 24 00:46:05.932460 containerd[1682]: 2026-01-24 00:46:05.912 [INFO][4378] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Namespace="calico-apiserver" 
Pod="calico-apiserver-7779cd58b4-xp2rb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" Jan 24 00:46:05.932460 containerd[1682]: 2026-01-24 00:46:05.913 [INFO][4378] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-xp2rb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0", GenerateName:"calico-apiserver-7779cd58b4-", Namespace:"calico-apiserver", SelfLink:"", UID:"d83d59e3-6296-40d9-bb63-5a69b654ac0c", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7779cd58b4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147", Pod:"calico-apiserver-7779cd58b4-xp2rb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.51.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3f283f077c2", MAC:"ae:56:1e:79:5b:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:05.932460 containerd[1682]: 2026-01-24 00:46:05.928 [INFO][4378] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-xp2rb" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--xp2rb-eth0" Jan 24 00:46:05.932299 systemd[1]: Started cri-containerd-57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03.scope - libcontainer container 57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03. 
Jan 24 00:46:05.944000 audit: BPF prog-id=190 op=LOAD Jan 24 00:46:05.945000 audit: BPF prog-id=191 op=LOAD Jan 24 00:46:05.945000 audit[4446]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4434 pid=4446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537616538313630346134326633663165303232313462326331373634 Jan 24 00:46:05.945000 audit: BPF prog-id=191 op=UNLOAD Jan 24 00:46:05.945000 audit[4446]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4434 pid=4446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537616538313630346134326633663165303232313462326331373634 Jan 24 00:46:05.945000 audit: BPF prog-id=192 op=LOAD Jan 24 00:46:05.945000 audit[4446]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4434 pid=4446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537616538313630346134326633663165303232313462326331373634 Jan 24 00:46:05.945000 audit: BPF prog-id=193 op=LOAD Jan 24 00:46:05.945000 audit[4446]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4434 pid=4446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537616538313630346134326633663165303232313462326331373634 Jan 24 00:46:05.945000 audit: BPF prog-id=193 op=UNLOAD Jan 24 00:46:05.945000 audit[4446]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4434 pid=4446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537616538313630346134326633663165303232313462326331373634 Jan 24 00:46:05.945000 audit: BPF prog-id=192 op=UNLOAD Jan 24 00:46:05.945000 audit[4446]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4434 pid=4446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537616538313630346134326633663165303232313462326331373634 Jan 24 00:46:05.945000 audit: BPF prog-id=194 op=LOAD Jan 24 00:46:05.945000 audit[4446]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4434 pid=4446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:05.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537616538313630346134326633663165303232313462326331373634 Jan 24 00:46:05.958117 containerd[1682]: time="2026-01-24T00:46:05.958085489Z" level=info msg="connecting to shim d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147" address="unix:///run/containerd/s/acbfe298927e2b50c26f9f316855dbae0424994cb694ec718ce9e5ae0196a76f" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:46:05.983233 systemd[1]: Started cri-containerd-d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147.scope - libcontainer container d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147. 
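Earlier in this log the whisker and whisker-backend pulls failed with NotFound for ghcr.io/flatcar/calico/*:v3.30.4, and just below the same pattern repeats for the kube-controllers and apiserver images. As a hedged sketch (not kubelet or CRI code), the same resolver failure can be reproduced directly against containerd with its Go client; the socket path and errdefs package path are assumptions based on the v1.x client layout:

    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	"github.com/containerd/containerd"
    	"github.com/containerd/containerd/errdefs"
    	"github.com/containerd/containerd/namespaces"
    )

    func main() {
    	client, err := containerd.New("/run/containerd/containerd.sock") // assumed default socket
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer client.Close()

    	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

    	// One of the references the kubelet keeps failing on in this log.
    	ref := "ghcr.io/flatcar/calico/kube-controllers:v3.30.4"

    	if _, err := client.Pull(ctx, ref, containerd.WithPullUnpack); err != nil {
    		if errdefs.IsNotFound(err) {
    			// Mirrors the "failed to resolve image ... not found" entries here.
    			fmt.Printf("%s: not found upstream\n", ref)
    			return
    		}
    		log.Fatal(err)
    	}
    	fmt.Println("pulled", ref)
    }

If the registry genuinely serves a 404 for the tag, as the "fetch failed after status: 404 Not Found" entry below suggests, the pull should land in the NotFound branch rather than a transient error.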
Jan 24 00:46:05.998017 containerd[1682]: time="2026-01-24T00:46:05.997947722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58fdcd774c-w2drb,Uid:f546e732-cf0b-44c7-9678-ae1cb31a23a4,Namespace:calico-system,Attempt:0,} returns sandbox id \"57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03\"" Jan 24 00:46:06.000632 containerd[1682]: time="2026-01-24T00:46:06.000596383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 24 00:46:06.001000 audit: BPF prog-id=195 op=LOAD Jan 24 00:46:06.001000 audit: BPF prog-id=196 op=LOAD Jan 24 00:46:06.001000 audit[4494]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=4483 pid=4494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439343534323061653335356537383664366265343331396361386130 Jan 24 00:46:06.001000 audit: BPF prog-id=196 op=UNLOAD Jan 24 00:46:06.001000 audit[4494]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4483 pid=4494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439343534323061653335356537383664366265343331396361386130 Jan 24 00:46:06.002000 audit: BPF prog-id=197 op=LOAD Jan 24 00:46:06.002000 audit[4494]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=4483 pid=4494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.002000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439343534323061653335356537383664366265343331396361386130 Jan 24 00:46:06.002000 audit: BPF prog-id=198 op=LOAD Jan 24 00:46:06.002000 audit[4494]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000128218 a2=98 a3=0 items=0 ppid=4483 pid=4494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.002000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439343534323061653335356537383664366265343331396361386130 Jan 24 00:46:06.002000 audit: BPF prog-id=198 op=UNLOAD Jan 24 00:46:06.002000 audit[4494]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4483 pid=4494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.002000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439343534323061653335356537383664366265343331396361386130 Jan 24 00:46:06.002000 audit: BPF prog-id=197 op=UNLOAD Jan 24 00:46:06.002000 audit[4494]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4483 pid=4494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.002000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439343534323061653335356537383664366265343331396361386130 Jan 24 00:46:06.002000 audit: BPF prog-id=199 op=LOAD Jan 24 00:46:06.002000 audit[4494]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001286e8 a2=98 a3=0 items=0 ppid=4483 pid=4494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.002000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439343534323061653335356537383664366265343331396361386130 Jan 24 00:46:06.034390 containerd[1682]: time="2026-01-24T00:46:06.034339168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7779cd58b4-xp2rb,Uid:d83d59e3-6296-40d9-bb63-5a69b654ac0c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147\"" Jan 24 00:46:06.206018 systemd-networkd[1553]: cali0ad53b444da: Gained IPv6LL Jan 24 00:46:06.438387 containerd[1682]: time="2026-01-24T00:46:06.438297726Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:06.440620 containerd[1682]: time="2026-01-24T00:46:06.440555766Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 24 00:46:06.440620 containerd[1682]: time="2026-01-24T00:46:06.440691816Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:06.441271 kubelet[2863]: E0124 00:46:06.441173 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:46:06.441271 kubelet[2863]: E0124 00:46:06.441254 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:46:06.442170 kubelet[2863]: E0124 00:46:06.441588 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5fsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:06.442403 containerd[1682]: time="2026-01-24T00:46:06.441723546Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:46:06.443023 kubelet[2863]: E0124 00:46:06.442813 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:46:06.616612 containerd[1682]: time="2026-01-24T00:46:06.616323581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-njf24,Uid:12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc,Namespace:calico-system,Attempt:0,}" Jan 24 00:46:06.616612 containerd[1682]: time="2026-01-24T00:46:06.616444401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-qhldj,Uid:d0cc0ca8-4b85-478b-b9c2-c61b42d93c89,Namespace:calico-system,Attempt:0,}" Jan 24 00:46:06.616612 containerd[1682]: time="2026-01-24T00:46:06.616601602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dfsrt,Uid:0a9fe640-bc48-43ab-ab91-0adb30a0cc77,Namespace:kube-system,Attempt:0,}" Jan 24 00:46:06.793620 systemd-networkd[1553]: cali11e6e378321: Link UP Jan 24 00:46:06.794240 systemd-networkd[1553]: cali11e6e378321: Gained carrier Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.701 [INFO][4546] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.722 [INFO][4546] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0 csi-node-driver- calico-system 12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc 754 0 2026-01-24 00:45:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4593-0-0-9-1308b066bf csi-node-driver-njf24 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali11e6e378321 [] [] }} ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Namespace="calico-system" Pod="csi-node-driver-njf24" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.723 [INFO][4546] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Namespace="calico-system" Pod="csi-node-driver-njf24" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.758 [INFO][4588] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" HandleID="k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Workload="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.758 [INFO][4588] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" HandleID="k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Workload="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5680), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-9-1308b066bf", "pod":"csi-node-driver-njf24", "timestamp":"2026-01-24 00:46:06.758672842 
+0000 UTC"}, Hostname:"ci-4593-0-0-9-1308b066bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.758 [INFO][4588] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.758 [INFO][4588] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.758 [INFO][4588] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-9-1308b066bf' Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.764 [INFO][4588] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.768 [INFO][4588] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.773 [INFO][4588] ipam/ipam.go 511: Trying affinity for 192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.774 [INFO][4588] ipam/ipam.go 158: Attempting to load block cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.775 [INFO][4588] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.775 [INFO][4588] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.51.64/26 handle="k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.777 [INFO][4588] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.780 [INFO][4588] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.51.64/26 handle="k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.787 [INFO][4588] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.51.69/26] block=192.168.51.64/26 handle="k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.787 [INFO][4588] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.51.69/26] handle="k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.787 [INFO][4588] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 24 00:46:06.805980 containerd[1682]: 2026-01-24 00:46:06.787 [INFO][4588] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.51.69/26] IPv6=[] ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" HandleID="k8s-pod-network.69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Workload="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" Jan 24 00:46:06.807347 containerd[1682]: 2026-01-24 00:46:06.790 [INFO][4546] cni-plugin/k8s.go 418: Populated endpoint ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Namespace="calico-system" Pod="csi-node-driver-njf24" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc", ResourceVersion:"754", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"", Pod:"csi-node-driver-njf24", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.51.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali11e6e378321", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:06.807347 containerd[1682]: 2026-01-24 00:46:06.790 [INFO][4546] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.51.69/32] ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Namespace="calico-system" Pod="csi-node-driver-njf24" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" Jan 24 00:46:06.807347 containerd[1682]: 2026-01-24 00:46:06.790 [INFO][4546] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali11e6e378321 ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Namespace="calico-system" Pod="csi-node-driver-njf24" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" Jan 24 00:46:06.807347 containerd[1682]: 2026-01-24 00:46:06.793 [INFO][4546] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Namespace="calico-system" Pod="csi-node-driver-njf24" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" Jan 24 00:46:06.807347 containerd[1682]: 2026-01-24 00:46:06.794 [INFO][4546] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Namespace="calico-system" Pod="csi-node-driver-njf24" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc", ResourceVersion:"754", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e", Pod:"csi-node-driver-njf24", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.51.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali11e6e378321", MAC:"96:1d:33:8c:35:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:06.807347 containerd[1682]: 2026-01-24 00:46:06.803 [INFO][4546] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" Namespace="calico-system" Pod="csi-node-driver-njf24" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-csi--node--driver--njf24-eth0" Jan 24 00:46:06.811993 kubelet[2863]: E0124 00:46:06.811949 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:46:06.831191 containerd[1682]: time="2026-01-24T00:46:06.831111632Z" level=info msg="connecting to shim 69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e" address="unix:///run/containerd/s/01981f2e316eb9eb185fc6c9029f7a3c736ec130dc45f0b86d4fd58cb6d97717" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:46:06.860337 systemd[1]: Started cri-containerd-69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e.scope - libcontainer container 69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e. 
Jan 24 00:46:06.870275 containerd[1682]: time="2026-01-24T00:46:06.869923518Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:06.871141 containerd[1682]: time="2026-01-24T00:46:06.871052118Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:46:06.871188 containerd[1682]: time="2026-01-24T00:46:06.871161228Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:06.871513 kubelet[2863]: E0124 00:46:06.871449 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:06.872301 kubelet[2863]: E0124 00:46:06.871607 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:06.872301 kubelet[2863]: E0124 00:46:06.871936 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tm5dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:06.873792 kubelet[2863]: E0124 00:46:06.873662 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:46:06.876000 audit: BPF prog-id=200 op=LOAD Jan 24 00:46:06.878149 kernel: kauditd_printk_skb: 133 callbacks suppressed Jan 24 00:46:06.878188 kernel: audit: type=1334 audit(1769215566.876:625): prog-id=200 op=LOAD Jan 24 00:46:06.879000 audit: BPF prog-id=201 op=LOAD Jan 24 00:46:06.883090 kernel: audit: type=1334 audit(1769215566.879:626): prog-id=201 op=LOAD Jan 24 00:46:06.879000 audit[4630]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.889152 kernel: audit: type=1300 audit(1769215566.879:626): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.879000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.895145 kernel: audit: type=1327 audit(1769215566.879:626): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.879000 audit: BPF prog-id=201 op=UNLOAD Jan 24 00:46:06.879000 audit[4630]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.898408 kernel: audit: type=1334 audit(1769215566.879:627): prog-id=201 op=UNLOAD Jan 24 00:46:06.898435 kernel: audit: type=1300 audit(1769215566.879:627): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.879000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.903831 kernel: audit: type=1327 audit(1769215566.879:627): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.879000 audit: BPF prog-id=202 op=LOAD Jan 24 00:46:06.879000 audit[4630]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.910942 kernel: audit: type=1334 audit(1769215566.879:628): prog-id=202 op=LOAD Jan 24 00:46:06.910976 kernel: audit: type=1300 audit(1769215566.879:628): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.879000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.916561 kernel: audit: type=1327 audit(1769215566.879:628): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.880000 audit: BPF prog-id=203 op=LOAD Jan 24 00:46:06.880000 audit[4630]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.880000 audit: BPF prog-id=203 op=UNLOAD Jan 24 00:46:06.880000 audit[4630]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.880000 audit: BPF prog-id=202 op=UNLOAD Jan 24 00:46:06.880000 audit[4630]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.880000 audit: BPF prog-id=204 op=LOAD Jan 24 00:46:06.880000 audit[4630]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4619 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:06.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639653036616538626332363565316662316633343138353034646561 Jan 24 00:46:06.925153 systemd-networkd[1553]: calib96b4d678f9: Link UP Jan 24 00:46:06.925329 systemd-networkd[1553]: calib96b4d678f9: Gained carrier Jan 24 00:46:06.938016 containerd[1682]: time="2026-01-24T00:46:06.937735137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-njf24,Uid:12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc,Namespace:calico-system,Attempt:0,} returns sandbox id \"69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e\"" Jan 24 00:46:06.941208 containerd[1682]: time="2026-01-24T00:46:06.941177348Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.714 [INFO][4559] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.725 [INFO][4559] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0 coredns-674b8bbfcf- kube-system 0a9fe640-bc48-43ab-ab91-0adb30a0cc77 843 0 2026-01-24 00:45:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-9-1308b066bf coredns-674b8bbfcf-dfsrt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib96b4d678f9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dfsrt" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.725 [INFO][4559] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dfsrt" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.765 [INFO][4585] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" 
HandleID="k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Workload="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.765 [INFO][4585] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" HandleID="k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Workload="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332110), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-9-1308b066bf", "pod":"coredns-674b8bbfcf-dfsrt", "timestamp":"2026-01-24 00:46:06.765159562 +0000 UTC"}, Hostname:"ci-4593-0-0-9-1308b066bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.765 [INFO][4585] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.787 [INFO][4585] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.787 [INFO][4585] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-9-1308b066bf' Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.865 [INFO][4585] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.872 [INFO][4585] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.880 [INFO][4585] ipam/ipam.go 511: Trying affinity for 192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.883 [INFO][4585] ipam/ipam.go 158: Attempting to load block cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.889 [INFO][4585] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.889 [INFO][4585] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.51.64/26 handle="k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.895 [INFO][4585] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1 Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.900 [INFO][4585] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.51.64/26 handle="k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.911 [INFO][4585] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.51.70/26] block=192.168.51.64/26 handle="k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.911 
[INFO][4585] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.51.70/26] handle="k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.911 [INFO][4585] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 24 00:46:06.944574 containerd[1682]: 2026-01-24 00:46:06.911 [INFO][4585] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.51.70/26] IPv6=[] ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" HandleID="k8s-pod-network.c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Workload="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" Jan 24 00:46:06.944992 containerd[1682]: 2026-01-24 00:46:06.922 [INFO][4559] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dfsrt" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0a9fe640-bc48-43ab-ab91-0adb30a0cc77", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"", Pod:"coredns-674b8bbfcf-dfsrt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.51.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib96b4d678f9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:06.944992 containerd[1682]: 2026-01-24 00:46:06.922 [INFO][4559] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.51.70/32] ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dfsrt" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" Jan 24 00:46:06.944992 containerd[1682]: 2026-01-24 00:46:06.922 [INFO][4559] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib96b4d678f9 ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dfsrt" 
WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" Jan 24 00:46:06.944992 containerd[1682]: 2026-01-24 00:46:06.924 [INFO][4559] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dfsrt" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" Jan 24 00:46:06.944992 containerd[1682]: 2026-01-24 00:46:06.925 [INFO][4559] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dfsrt" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0a9fe640-bc48-43ab-ab91-0adb30a0cc77", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1", Pod:"coredns-674b8bbfcf-dfsrt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.51.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib96b4d678f9", MAC:"ca:5e:6e:60:5a:bf", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:06.944992 containerd[1682]: 2026-01-24 00:46:06.940 [INFO][4559] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dfsrt" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-coredns--674b8bbfcf--dfsrt-eth0" Jan 24 00:46:06.965270 containerd[1682]: time="2026-01-24T00:46:06.965195521Z" level=info msg="connecting to shim c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1" address="unix:///run/containerd/s/627808e63a0aa03586f9b5969acb2dacc53e1909f35a200de743bdd622b18196" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:46:06.994202 systemd[1]: Started cri-containerd-c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1.scope - libcontainer container 
c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1. Jan 24 00:46:07.015000 audit: BPF prog-id=205 op=LOAD Jan 24 00:46:07.015484 systemd-networkd[1553]: calie933ef94a95: Link UP Jan 24 00:46:07.016000 audit: BPF prog-id=206 op=LOAD Jan 24 00:46:07.016000 audit[4683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4672 pid=4683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.016000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323930366166613538363239383230663061666163623435383132 Jan 24 00:46:07.016000 audit: BPF prog-id=206 op=UNLOAD Jan 24 00:46:07.016000 audit[4683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.016000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323930366166613538363239383230663061666163623435383132 Jan 24 00:46:07.016000 audit: BPF prog-id=207 op=LOAD Jan 24 00:46:07.016000 audit[4683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4672 pid=4683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.016000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323930366166613538363239383230663061666163623435383132 Jan 24 00:46:07.016000 audit: BPF prog-id=208 op=LOAD Jan 24 00:46:07.016000 audit[4683]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4672 pid=4683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.016000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323930366166613538363239383230663061666163623435383132 Jan 24 00:46:07.016000 audit: BPF prog-id=208 op=UNLOAD Jan 24 00:46:07.016000 audit[4683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.016000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323930366166613538363239383230663061666163623435383132 Jan 24 00:46:07.016000 audit: BPF prog-id=207 op=UNLOAD Jan 24 00:46:07.016000 audit[4683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.016000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323930366166613538363239383230663061666163623435383132 Jan 24 00:46:07.016000 audit: BPF prog-id=209 op=LOAD Jan 24 00:46:07.016000 audit[4683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4672 pid=4683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.016000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323930366166613538363239383230663061666163623435383132 Jan 24 00:46:07.015762 systemd-networkd[1553]: calie933ef94a95: Gained carrier Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.704 [INFO][4548] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.727 [INFO][4548] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0 goldmane-666569f655- calico-system d0cc0ca8-4b85-478b-b9c2-c61b42d93c89 834 0 2026-01-24 00:45:39 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4593-0-0-9-1308b066bf goldmane-666569f655-qhldj eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie933ef94a95 [] [] }} ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Namespace="calico-system" Pod="goldmane-666569f655-qhldj" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.727 [INFO][4548] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Namespace="calico-system" Pod="goldmane-666569f655-qhldj" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.765 [INFO][4592] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" HandleID="k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Workload="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" Jan 24 
00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.765 [INFO][4592] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" HandleID="k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Workload="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-9-1308b066bf", "pod":"goldmane-666569f655-qhldj", "timestamp":"2026-01-24 00:46:06.765376193 +0000 UTC"}, Hostname:"ci-4593-0-0-9-1308b066bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.765 [INFO][4592] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.912 [INFO][4592] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.912 [INFO][4592] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-9-1308b066bf' Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.967 [INFO][4592] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.977 [INFO][4592] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.981 [INFO][4592] ipam/ipam.go 511: Trying affinity for 192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.982 [INFO][4592] ipam/ipam.go 158: Attempting to load block cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.985 [INFO][4592] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.985 [INFO][4592] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.51.64/26 handle="k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.986 [INFO][4592] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:06.993 [INFO][4592] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.51.64/26 handle="k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:07.007 [INFO][4592] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.51.71/26] block=192.168.51.64/26 handle="k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:07.007 [INFO][4592] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.51.71/26] handle="k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" 
host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:07.007 [INFO][4592] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 24 00:46:07.030357 containerd[1682]: 2026-01-24 00:46:07.007 [INFO][4592] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.51.71/26] IPv6=[] ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" HandleID="k8s-pod-network.d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Workload="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" Jan 24 00:46:07.030768 containerd[1682]: 2026-01-24 00:46:07.011 [INFO][4548] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Namespace="calico-system" Pod="goldmane-666569f655-qhldj" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"d0cc0ca8-4b85-478b-b9c2-c61b42d93c89", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"", Pod:"goldmane-666569f655-qhldj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.51.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie933ef94a95", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:07.030768 containerd[1682]: 2026-01-24 00:46:07.012 [INFO][4548] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.51.71/32] ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Namespace="calico-system" Pod="goldmane-666569f655-qhldj" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" Jan 24 00:46:07.030768 containerd[1682]: 2026-01-24 00:46:07.012 [INFO][4548] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie933ef94a95 ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Namespace="calico-system" Pod="goldmane-666569f655-qhldj" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" Jan 24 00:46:07.030768 containerd[1682]: 2026-01-24 00:46:07.013 [INFO][4548] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Namespace="calico-system" Pod="goldmane-666569f655-qhldj" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" Jan 24 00:46:07.030768 containerd[1682]: 2026-01-24 00:46:07.014 [INFO][4548] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Namespace="calico-system" Pod="goldmane-666569f655-qhldj" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"d0cc0ca8-4b85-478b-b9c2-c61b42d93c89", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a", Pod:"goldmane-666569f655-qhldj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.51.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie933ef94a95", MAC:"8e:31:de:f8:9e:7d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:07.030768 containerd[1682]: 2026-01-24 00:46:07.025 [INFO][4548] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" Namespace="calico-system" Pod="goldmane-666569f655-qhldj" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-goldmane--666569f655--qhldj-eth0" Jan 24 00:46:07.051584 containerd[1682]: time="2026-01-24T00:46:07.051545186Z" level=info msg="connecting to shim d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a" address="unix:///run/containerd/s/2f0dceb7546ee24d4d3a3289bded57dd339465b6497a3b3900cdbbc7e2f4ad41" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:46:07.064295 containerd[1682]: time="2026-01-24T00:46:07.064241119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dfsrt,Uid:0a9fe640-bc48-43ab-ab91-0adb30a0cc77,Namespace:kube-system,Attempt:0,} returns sandbox id \"c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1\"" Jan 24 00:46:07.071246 containerd[1682]: time="2026-01-24T00:46:07.071198299Z" level=info msg="CreateContainer within sandbox \"c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 24 00:46:07.080248 systemd[1]: Started cri-containerd-d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a.scope - libcontainer container d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a. 
Jan 24 00:46:07.081658 containerd[1682]: time="2026-01-24T00:46:07.081617041Z" level=info msg="Container 5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:46:07.087472 containerd[1682]: time="2026-01-24T00:46:07.087443803Z" level=info msg="CreateContainer within sandbox \"c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042\"" Jan 24 00:46:07.089573 containerd[1682]: time="2026-01-24T00:46:07.089463163Z" level=info msg="StartContainer for \"5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042\"" Jan 24 00:46:07.091023 containerd[1682]: time="2026-01-24T00:46:07.090915133Z" level=info msg="connecting to shim 5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042" address="unix:///run/containerd/s/627808e63a0aa03586f9b5969acb2dacc53e1909f35a200de743bdd622b18196" protocol=ttrpc version=3 Jan 24 00:46:07.092000 audit: BPF prog-id=210 op=LOAD Jan 24 00:46:07.092000 audit: BPF prog-id=211 op=LOAD Jan 24 00:46:07.092000 audit[4733]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4717 pid=4733 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666563636535643732656233363830313030323566613131396366 Jan 24 00:46:07.092000 audit: BPF prog-id=211 op=UNLOAD Jan 24 00:46:07.092000 audit[4733]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4717 pid=4733 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666563636535643732656233363830313030323566613131396366 Jan 24 00:46:07.092000 audit: BPF prog-id=212 op=LOAD Jan 24 00:46:07.092000 audit[4733]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4717 pid=4733 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666563636535643732656233363830313030323566613131396366 Jan 24 00:46:07.092000 audit: BPF prog-id=213 op=LOAD Jan 24 00:46:07.092000 audit[4733]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4717 pid=4733 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.092000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666563636535643732656233363830313030323566613131396366 Jan 24 00:46:07.092000 audit: BPF prog-id=213 op=UNLOAD Jan 24 00:46:07.092000 audit[4733]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4717 pid=4733 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666563636535643732656233363830313030323566613131396366 Jan 24 00:46:07.092000 audit: BPF prog-id=212 op=UNLOAD Jan 24 00:46:07.092000 audit[4733]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4717 pid=4733 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666563636535643732656233363830313030323566613131396366 Jan 24 00:46:07.092000 audit: BPF prog-id=214 op=LOAD Jan 24 00:46:07.092000 audit[4733]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4717 pid=4733 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666563636535643732656233363830313030323566613131396366 Jan 24 00:46:07.113484 systemd[1]: Started cri-containerd-5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042.scope - libcontainer container 5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042. 
Jan 24 00:46:07.128000 audit: BPF prog-id=215 op=LOAD Jan 24 00:46:07.129000 audit: BPF prog-id=216 op=LOAD Jan 24 00:46:07.129000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=4672 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564363839623561306237343633613537393165633564383439333438 Jan 24 00:46:07.129000 audit: BPF prog-id=216 op=UNLOAD Jan 24 00:46:07.129000 audit[4752]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564363839623561306237343633613537393165633564383439333438 Jan 24 00:46:07.130000 audit: BPF prog-id=217 op=LOAD Jan 24 00:46:07.130000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=4672 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.130000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564363839623561306237343633613537393165633564383439333438 Jan 24 00:46:07.130000 audit: BPF prog-id=218 op=LOAD Jan 24 00:46:07.130000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=4672 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.130000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564363839623561306237343633613537393165633564383439333438 Jan 24 00:46:07.130000 audit: BPF prog-id=218 op=UNLOAD Jan 24 00:46:07.130000 audit[4752]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.130000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564363839623561306237343633613537393165633564383439333438 Jan 24 00:46:07.130000 audit: BPF prog-id=217 op=UNLOAD Jan 24 00:46:07.130000 audit[4752]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.130000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564363839623561306237343633613537393165633564383439333438 Jan 24 00:46:07.131000 audit: BPF prog-id=219 op=LOAD Jan 24 00:46:07.131000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=4672 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.131000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564363839623561306237343633613537393165633564383439333438 Jan 24 00:46:07.141835 containerd[1682]: time="2026-01-24T00:46:07.140572302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-qhldj,Uid:d0cc0ca8-4b85-478b-b9c2-c61b42d93c89,Namespace:calico-system,Attempt:0,} returns sandbox id \"d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a\"" Jan 24 00:46:07.158220 containerd[1682]: time="2026-01-24T00:46:07.158195086Z" level=info msg="StartContainer for \"5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042\" returns successfully" Jan 24 00:46:07.357773 systemd-networkd[1553]: cali31e920bd568: Gained IPv6LL Jan 24 00:46:07.371888 containerd[1682]: time="2026-01-24T00:46:07.371815036Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:07.379124 containerd[1682]: time="2026-01-24T00:46:07.377155986Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 24 00:46:07.379124 containerd[1682]: time="2026-01-24T00:46:07.377269406Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:07.379353 kubelet[2863]: E0124 00:46:07.378146 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:46:07.379353 kubelet[2863]: E0124 00:46:07.378227 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:46:07.380874 containerd[1682]: time="2026-01-24T00:46:07.380751507Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 24 00:46:07.387603 kubelet[2863]: E0124 00:46:07.387529 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:07.611430 systemd-networkd[1553]: cali3f283f077c2: Gained IPv6LL Jan 24 00:46:07.616086 containerd[1682]: time="2026-01-24T00:46:07.616001500Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7779cd58b4-jsxfd,Uid:7fb26181-9fdc-4f96-be2c-85fbaa5f21b7,Namespace:calico-apiserver,Attempt:0,}" Jan 24 00:46:07.763986 systemd-networkd[1553]: calid3bb2583a5d: Link UP Jan 24 00:46:07.764422 systemd-networkd[1553]: calid3bb2583a5d: Gained carrier Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.669 [INFO][4810] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.689 [INFO][4810] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0 calico-apiserver-7779cd58b4- calico-apiserver 7fb26181-9fdc-4f96-be2c-85fbaa5f21b7 839 0 2026-01-24 00:45:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7779cd58b4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-9-1308b066bf calico-apiserver-7779cd58b4-jsxfd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid3bb2583a5d 
[] [] }} ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-jsxfd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.689 [INFO][4810] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-jsxfd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.719 [INFO][4822] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" HandleID="k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.719 [INFO][4822] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" HandleID="k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-9-1308b066bf", "pod":"calico-apiserver-7779cd58b4-jsxfd", "timestamp":"2026-01-24 00:46:07.71908787 +0000 UTC"}, Hostname:"ci-4593-0-0-9-1308b066bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.719 [INFO][4822] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.719 [INFO][4822] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.719 [INFO][4822] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-9-1308b066bf' Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.726 [INFO][4822] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.730 [INFO][4822] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.734 [INFO][4822] ipam/ipam.go 511: Trying affinity for 192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.735 [INFO][4822] ipam/ipam.go 158: Attempting to load block cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.737 [INFO][4822] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.51.64/26 host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.737 [INFO][4822] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.51.64/26 handle="k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.739 [INFO][4822] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.745 [INFO][4822] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.51.64/26 handle="k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.751 [INFO][4822] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.51.72/26] block=192.168.51.64/26 handle="k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.752 [INFO][4822] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.51.72/26] handle="k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" host="ci-4593-0-0-9-1308b066bf" Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.752 [INFO][4822] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 24 00:46:07.771396 containerd[1682]: 2026-01-24 00:46:07.752 [INFO][4822] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.51.72/26] IPv6=[] ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" HandleID="k8s-pod-network.29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Workload="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" Jan 24 00:46:07.771976 containerd[1682]: 2026-01-24 00:46:07.756 [INFO][4810] cni-plugin/k8s.go 418: Populated endpoint ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-jsxfd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0", GenerateName:"calico-apiserver-7779cd58b4-", Namespace:"calico-apiserver", SelfLink:"", UID:"7fb26181-9fdc-4f96-be2c-85fbaa5f21b7", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7779cd58b4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"", Pod:"calico-apiserver-7779cd58b4-jsxfd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.51.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid3bb2583a5d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:07.771976 containerd[1682]: 2026-01-24 00:46:07.756 [INFO][4810] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.51.72/32] ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-jsxfd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" Jan 24 00:46:07.771976 containerd[1682]: 2026-01-24 00:46:07.756 [INFO][4810] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid3bb2583a5d ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-jsxfd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" Jan 24 00:46:07.771976 containerd[1682]: 2026-01-24 00:46:07.758 [INFO][4810] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-jsxfd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" Jan 24 00:46:07.771976 containerd[1682]: 2026-01-24 
00:46:07.759 [INFO][4810] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-jsxfd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0", GenerateName:"calico-apiserver-7779cd58b4-", Namespace:"calico-apiserver", SelfLink:"", UID:"7fb26181-9fdc-4f96-be2c-85fbaa5f21b7", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2026, time.January, 24, 0, 45, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7779cd58b4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-9-1308b066bf", ContainerID:"29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae", Pod:"calico-apiserver-7779cd58b4-jsxfd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.51.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid3bb2583a5d", MAC:"fa:02:2a:9c:a6:fc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 24 00:46:07.771976 containerd[1682]: 2026-01-24 00:46:07.768 [INFO][4810] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" Namespace="calico-apiserver" Pod="calico-apiserver-7779cd58b4-jsxfd" WorkloadEndpoint="ci--4593--0--0--9--1308b066bf-k8s-calico--apiserver--7779cd58b4--jsxfd-eth0" Jan 24 00:46:07.800526 containerd[1682]: time="2026-01-24T00:46:07.800459916Z" level=info msg="connecting to shim 29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae" address="unix:///run/containerd/s/3fb563471bb7c4f933eeb6b6d77cf594902a043f3053524ab78aae9f559aa8c5" namespace=k8s.io protocol=ttrpc version=3 Jan 24 00:46:07.822778 kubelet[2863]: E0124 00:46:07.822705 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:46:07.824389 kubelet[2863]: E0124 00:46:07.824367 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:46:07.836473 kubelet[2863]: I0124 00:46:07.836184 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dfsrt" podStartSLOduration=39.836151853 podStartE2EDuration="39.836151853s" podCreationTimestamp="2026-01-24 00:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-24 00:46:07.834146522 +0000 UTC m=+44.366133349" watchObservedRunningTime="2026-01-24 00:46:07.836151853 +0000 UTC m=+44.368138670" Jan 24 00:46:07.838046 containerd[1682]: time="2026-01-24T00:46:07.838016152Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:07.840272 containerd[1682]: time="2026-01-24T00:46:07.839999403Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 24 00:46:07.840342 containerd[1682]: time="2026-01-24T00:46:07.840329303Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:07.840904 kubelet[2863]: E0124 00:46:07.840563 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:46:07.841073 kubelet[2863]: E0124 00:46:07.841030 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:46:07.843267 containerd[1682]: time="2026-01-24T00:46:07.843210273Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 24 00:46:07.843959 kubelet[2863]: E0124 00:46:07.843463 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:07.845183 kubelet[2863]: E0124 00:46:07.845166 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:46:07.864391 systemd[1]: Started cri-containerd-29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae.scope - libcontainer container 
29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae. Jan 24 00:46:07.871000 audit[4868]: NETFILTER_CFG table=filter:123 family=2 entries=16 op=nft_register_rule pid=4868 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:07.871000 audit[4868]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc1f011230 a2=0 a3=7ffc1f01121c items=0 ppid=2972 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.871000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:07.877000 audit[4868]: NETFILTER_CFG table=nat:124 family=2 entries=42 op=nft_register_rule pid=4868 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:07.877000 audit[4868]: SYSCALL arch=c000003e syscall=46 success=yes exit=13428 a0=3 a1=7ffc1f011230 a2=0 a3=7ffc1f01121c items=0 ppid=2972 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.877000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:07.889000 audit: BPF prog-id=220 op=LOAD Jan 24 00:46:07.889000 audit: BPF prog-id=221 op=LOAD Jan 24 00:46:07.889000 audit[4854]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=4843 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.889000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239633733646562376135653934653733656661646538633466653466 Jan 24 00:46:07.890000 audit: BPF prog-id=221 op=UNLOAD Jan 24 00:46:07.890000 audit[4854]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4843 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239633733646562376135653934653733656661646538633466653466 Jan 24 00:46:07.890000 audit: BPF prog-id=222 op=LOAD Jan 24 00:46:07.890000 audit[4854]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=4843 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239633733646562376135653934653733656661646538633466653466 Jan 24 
00:46:07.890000 audit: BPF prog-id=223 op=LOAD Jan 24 00:46:07.890000 audit[4854]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=4843 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239633733646562376135653934653733656661646538633466653466 Jan 24 00:46:07.890000 audit: BPF prog-id=223 op=UNLOAD Jan 24 00:46:07.890000 audit[4854]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4843 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239633733646562376135653934653733656661646538633466653466 Jan 24 00:46:07.890000 audit: BPF prog-id=222 op=UNLOAD Jan 24 00:46:07.890000 audit[4854]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4843 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239633733646562376135653934653733656661646538633466653466 Jan 24 00:46:07.890000 audit: BPF prog-id=224 op=LOAD Jan 24 00:46:07.890000 audit[4854]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=4843 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239633733646562376135653934653733656661646538633466653466 Jan 24 00:46:07.909000 audit[4878]: NETFILTER_CFG table=filter:125 family=2 entries=16 op=nft_register_rule pid=4878 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:07.909000 audit[4878]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcda9a3b80 a2=0 a3=7ffcda9a3b6c items=0 ppid=2972 pid=4878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.909000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:07.914000 audit[4878]: NETFILTER_CFG table=nat:126 family=2 entries=18 op=nft_register_rule pid=4878 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 24 00:46:07.914000 audit[4878]: SYSCALL arch=c000003e syscall=46 success=yes exit=5004 a0=3 a1=7ffcda9a3b80 a2=0 a3=0 items=0 ppid=2972 pid=4878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:07.914000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:07.926494 containerd[1682]: time="2026-01-24T00:46:07.926442769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7779cd58b4-jsxfd,Uid:7fb26181-9fdc-4f96-be2c-85fbaa5f21b7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae\"" Jan 24 00:46:08.267557 containerd[1682]: time="2026-01-24T00:46:08.267258903Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:08.269366 containerd[1682]: time="2026-01-24T00:46:08.269095933Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 24 00:46:08.269366 containerd[1682]: time="2026-01-24T00:46:08.269258163Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:08.270026 kubelet[2863]: E0124 00:46:08.269969 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:46:08.270595 kubelet[2863]: E0124 00:46:08.270027 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:46:08.270595 kubelet[2863]: E0124 00:46:08.270400 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:08.273193 containerd[1682]: time="2026-01-24T00:46:08.271267983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:46:08.273858 kubelet[2863]: E0124 00:46:08.271827 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:46:08.316802 systemd-networkd[1553]: calie933ef94a95: Gained IPv6LL Jan 24 00:46:08.318866 systemd-networkd[1553]: cali11e6e378321: Gained IPv6LL Jan 24 00:46:08.572260 systemd-networkd[1553]: calib96b4d678f9: Gained IPv6LL Jan 24 00:46:08.718459 containerd[1682]: time="2026-01-24T00:46:08.718379525Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:08.720221 containerd[1682]: time="2026-01-24T00:46:08.720174145Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:46:08.720327 containerd[1682]: time="2026-01-24T00:46:08.720211255Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:08.721271 kubelet[2863]: E0124 00:46:08.721147 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:08.721587 kubelet[2863]: E0124 00:46:08.721541 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:08.721915 kubelet[2863]: E0124 00:46:08.721810 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llmb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 
00:46:08.723183 kubelet[2863]: E0124 00:46:08.723133 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:46:08.831749 kubelet[2863]: E0124 00:46:08.829324 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:46:08.831749 kubelet[2863]: E0124 00:46:08.830599 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:46:08.833224 kubelet[2863]: E0124 00:46:08.832771 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:46:08.902000 audit[4908]: NETFILTER_CFG table=filter:127 family=2 entries=16 op=nft_register_rule pid=4908 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:08.902000 audit[4908]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffeaa7031e0 a2=0 a3=7ffeaa7031cc items=0 ppid=2972 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:08.902000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:08.917000 audit[4908]: NETFILTER_CFG table=nat:128 family=2 entries=54 op=nft_register_chain pid=4908 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:08.917000 audit[4908]: SYSCALL arch=c000003e syscall=46 success=yes exit=19092 a0=3 
a1=7ffeaa7031e0 a2=0 a3=7ffeaa7031cc items=0 ppid=2972 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:08.917000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:09.211433 systemd-networkd[1553]: calid3bb2583a5d: Gained IPv6LL Jan 24 00:46:09.836166 kubelet[2863]: E0124 00:46:09.835442 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:46:09.956000 audit[4932]: NETFILTER_CFG table=filter:129 family=2 entries=16 op=nft_register_rule pid=4932 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:09.956000 audit[4932]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffeadb65c80 a2=0 a3=7ffeadb65c6c items=0 ppid=2972 pid=4932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:09.956000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:09.962000 audit[4932]: NETFILTER_CFG table=nat:130 family=2 entries=18 op=nft_register_rule pid=4932 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:09.962000 audit[4932]: SYSCALL arch=c000003e syscall=46 success=yes exit=5004 a0=3 a1=7ffeadb65c80 a2=0 a3=0 items=0 ppid=2972 pid=4932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:09.962000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:10.370484 kubelet[2863]: I0124 00:46:10.368854 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 24 00:46:10.973000 audit[4976]: NETFILTER_CFG table=filter:131 family=2 entries=15 op=nft_register_rule pid=4976 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:10.973000 audit[4976]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe97196ba0 a2=0 a3=7ffe97196b8c items=0 ppid=2972 pid=4976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:10.973000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:10.977000 audit[4976]: NETFILTER_CFG table=nat:132 family=2 entries=25 op=nft_register_chain pid=4976 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:46:10.977000 audit[4976]: SYSCALL arch=c000003e syscall=46 success=yes exit=8580 a0=3 
a1=7ffe97196ba0 a2=0 a3=7ffe97196b8c items=0 ppid=2972 pid=4976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:10.977000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:46:11.580000 audit: BPF prog-id=225 op=LOAD Jan 24 00:46:11.580000 audit[4993]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd4bd74c50 a2=98 a3=1fffffffffffffff items=0 ppid=4977 pid=4993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.580000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 24 00:46:11.581000 audit: BPF prog-id=225 op=UNLOAD Jan 24 00:46:11.581000 audit[4993]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd4bd74c20 a3=0 items=0 ppid=4977 pid=4993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.581000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 24 00:46:11.581000 audit: BPF prog-id=226 op=LOAD Jan 24 00:46:11.581000 audit[4993]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd4bd74b30 a2=94 a3=3 items=0 ppid=4977 pid=4993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.581000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 24 00:46:11.581000 audit: BPF prog-id=226 op=UNLOAD Jan 24 00:46:11.581000 audit[4993]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd4bd74b30 a2=94 a3=3 items=0 ppid=4977 pid=4993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.581000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 24 00:46:11.581000 audit: BPF prog-id=227 op=LOAD Jan 24 00:46:11.581000 audit[4993]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd4bd74b70 a2=94 a3=7ffd4bd74d50 items=0 ppid=4977 pid=4993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.581000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 24 00:46:11.581000 audit: BPF prog-id=227 op=UNLOAD Jan 24 00:46:11.581000 audit[4993]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd4bd74b70 a2=94 a3=7ffd4bd74d50 items=0 ppid=4977 pid=4993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.581000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 24 00:46:11.585000 audit: BPF prog-id=228 op=LOAD Jan 24 00:46:11.585000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd964aabf0 a2=98 a3=3 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.585000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.586000 audit: BPF prog-id=228 op=UNLOAD Jan 24 00:46:11.586000 audit[4994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd964aabc0 a3=0 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.586000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.586000 audit: BPF prog-id=229 op=LOAD Jan 24 00:46:11.586000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd964aa9e0 a2=94 a3=54428f items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.586000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.586000 audit: BPF prog-id=229 op=UNLOAD Jan 24 00:46:11.586000 audit[4994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd964aa9e0 a2=94 a3=54428f items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.586000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.586000 audit: BPF prog-id=230 op=LOAD Jan 24 00:46:11.586000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd964aaa10 a2=94 a3=2 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.586000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.586000 audit: BPF prog-id=230 op=UNLOAD Jan 24 00:46:11.586000 audit[4994]: 
SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd964aaa10 a2=0 a3=2 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.586000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.796000 audit: BPF prog-id=231 op=LOAD Jan 24 00:46:11.796000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd964aa8d0 a2=94 a3=1 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.796000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.797000 audit: BPF prog-id=231 op=UNLOAD Jan 24 00:46:11.797000 audit[4994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd964aa8d0 a2=94 a3=1 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.797000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.816000 audit: BPF prog-id=232 op=LOAD Jan 24 00:46:11.816000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd964aa8c0 a2=94 a3=4 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.816000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.816000 audit: BPF prog-id=232 op=UNLOAD Jan 24 00:46:11.816000 audit[4994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd964aa8c0 a2=0 a3=4 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.816000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.817000 audit: BPF prog-id=233 op=LOAD Jan 24 00:46:11.817000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd964aa720 a2=94 a3=5 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.817000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.817000 audit: BPF prog-id=233 op=UNLOAD Jan 24 00:46:11.817000 audit[4994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd964aa720 a2=0 a3=5 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.817000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.817000 audit: BPF prog-id=234 op=LOAD Jan 24 00:46:11.817000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd964aa940 a2=94 a3=6 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.817000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.817000 audit: BPF prog-id=234 op=UNLOAD Jan 24 00:46:11.817000 audit[4994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd964aa940 a2=0 a3=6 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.817000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.818000 audit: BPF prog-id=235 op=LOAD Jan 24 00:46:11.818000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd964aa0f0 a2=94 a3=88 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.818000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.818000 audit: BPF prog-id=236 op=LOAD Jan 24 00:46:11.818000 audit[4994]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffd964a9f70 a2=94 a3=2 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.818000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.818000 audit: BPF prog-id=236 op=UNLOAD Jan 24 00:46:11.818000 audit[4994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffd964a9fa0 a2=0 a3=7ffd964aa0a0 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.818000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.819000 audit: BPF prog-id=235 op=UNLOAD Jan 24 00:46:11.819000 audit[4994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=3916d10 a2=0 a3=accdd6a732b3a3e6 items=0 ppid=4977 pid=4994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.819000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 24 00:46:11.832000 audit: BPF prog-id=237 op=LOAD Jan 24 00:46:11.832000 audit[4997]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe4864e450 a2=98 a3=1999999999999999 items=0 ppid=4977 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.832000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 24 00:46:11.832000 audit: BPF prog-id=237 op=UNLOAD Jan 24 00:46:11.832000 audit[4997]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe4864e420 a3=0 items=0 ppid=4977 pid=4997 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.832000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 24 00:46:11.832000 audit: BPF prog-id=238 op=LOAD Jan 24 00:46:11.832000 audit[4997]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe4864e330 a2=94 a3=ffff items=0 ppid=4977 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.832000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 24 00:46:11.833000 audit: BPF prog-id=238 op=UNLOAD Jan 24 00:46:11.833000 audit[4997]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe4864e330 a2=94 a3=ffff items=0 ppid=4977 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.833000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 24 00:46:11.833000 audit: BPF prog-id=239 op=LOAD Jan 24 00:46:11.833000 audit[4997]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe4864e370 a2=94 a3=7ffe4864e550 items=0 ppid=4977 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.833000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 24 00:46:11.834000 audit: BPF prog-id=239 op=UNLOAD Jan 24 00:46:11.834000 audit[4997]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe4864e370 a2=94 a3=7ffe4864e550 items=0 ppid=4977 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.834000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 24 00:46:11.923046 systemd-networkd[1553]: vxlan.calico: Link UP Jan 24 00:46:11.923054 systemd-networkd[1553]: vxlan.calico: Gained carrier Jan 24 00:46:11.978000 audit: BPF prog-id=240 op=LOAD Jan 24 00:46:11.985152 kernel: kauditd_printk_skb: 220 callbacks suppressed Jan 24 
00:46:11.985227 kernel: audit: type=1334 audit(1769215571.978:705): prog-id=240 op=LOAD Jan 24 00:46:11.978000 audit[5029]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd11c7f830 a2=98 a3=0 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.001934 kernel: audit: type=1300 audit(1769215571.978:705): arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd11c7f830 a2=98 a3=0 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.978000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:12.011189 kernel: audit: type=1327 audit(1769215571.978:705): proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.978000 audit: BPF prog-id=240 op=UNLOAD Jan 24 00:46:12.014198 kernel: audit: type=1334 audit(1769215571.978:706): prog-id=240 op=UNLOAD Jan 24 00:46:12.014323 kernel: audit: type=1300 audit(1769215571.978:706): arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd11c7f800 a3=0 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.978000 audit[5029]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd11c7f800 a3=0 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.025408 kernel: audit: type=1327 audit(1769215571.978:706): proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.978000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:12.027973 kernel: audit: type=1334 audit(1769215571.979:707): prog-id=241 op=LOAD Jan 24 00:46:11.979000 audit: BPF prog-id=241 op=LOAD Jan 24 00:46:11.979000 audit[5029]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd11c7f640 a2=94 a3=54428f items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.039595 kernel: audit: type=1300 audit(1769215571.979:707): arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd11c7f640 a2=94 a3=54428f items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.039647 kernel: audit: type=1327 audit(1769215571.979:707): proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.979000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:12.041718 kernel: audit: type=1334 audit(1769215571.979:708): prog-id=241 op=UNLOAD Jan 24 00:46:11.979000 audit: BPF prog-id=241 op=UNLOAD Jan 24 00:46:11.979000 audit[5029]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd11c7f640 a2=94 a3=54428f items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.979000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.979000 audit: BPF prog-id=242 op=LOAD Jan 24 00:46:11.979000 audit[5029]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd11c7f670 a2=94 a3=2 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.979000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.979000 audit: BPF prog-id=242 op=UNLOAD Jan 24 00:46:11.979000 audit[5029]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd11c7f670 a2=0 a3=2 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.979000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.979000 audit: BPF prog-id=243 op=LOAD Jan 24 00:46:11.979000 audit[5029]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd11c7f420 a2=94 a3=4 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.979000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.979000 audit: BPF prog-id=243 op=UNLOAD Jan 24 00:46:11.979000 audit[5029]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd11c7f420 a2=94 a3=4 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.979000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.979000 audit: BPF prog-id=244 op=LOAD Jan 24 00:46:11.979000 audit[5029]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd11c7f520 a2=94 a3=7ffd11c7f6a0 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.979000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.979000 audit: BPF prog-id=244 op=UNLOAD Jan 24 00:46:11.979000 audit[5029]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd11c7f520 a2=0 a3=7ffd11c7f6a0 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.979000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.980000 audit: BPF prog-id=245 op=LOAD Jan 24 00:46:11.980000 audit[5029]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd11c7ec50 a2=94 a3=2 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.980000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.980000 audit: BPF prog-id=245 op=UNLOAD Jan 24 00:46:11.980000 audit[5029]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd11c7ec50 a2=0 a3=2 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.980000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:11.980000 audit: BPF prog-id=246 op=LOAD Jan 24 00:46:11.980000 audit[5029]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd11c7ed50 a2=94 a3=30 items=0 ppid=4977 pid=5029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:11.980000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 24 00:46:12.000000 audit: BPF prog-id=247 op=LOAD Jan 24 00:46:12.000000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdc47f2050 a2=98 a3=0 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.000000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.000000 audit: BPF prog-id=247 op=UNLOAD Jan 24 00:46:12.000000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffdc47f2020 a3=0 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.000000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.000000 audit: BPF prog-id=248 op=LOAD Jan 24 00:46:12.000000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdc47f1e40 a2=94 a3=54428f items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.000000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.000000 audit: BPF prog-id=248 op=UNLOAD Jan 24 00:46:12.000000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdc47f1e40 a2=94 a3=54428f items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.000000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.042000 audit: BPF prog-id=249 op=LOAD Jan 24 00:46:12.042000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdc47f1e70 a2=94 a3=2 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.042000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.043000 audit: BPF prog-id=249 op=UNLOAD Jan 24 00:46:12.043000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdc47f1e70 a2=0 a3=2 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.043000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.208000 audit: BPF prog-id=250 op=LOAD Jan 24 00:46:12.208000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdc47f1d30 a2=94 a3=1 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.208000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.211000 audit: BPF prog-id=250 op=UNLOAD Jan 24 00:46:12.211000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdc47f1d30 a2=94 a3=1 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.211000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.219000 audit: BPF prog-id=251 op=LOAD Jan 24 00:46:12.219000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdc47f1d20 a2=94 a3=4 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.219000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.219000 audit: BPF prog-id=251 op=UNLOAD Jan 24 00:46:12.219000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffdc47f1d20 a2=0 a3=4 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.219000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.219000 audit: BPF prog-id=252 op=LOAD Jan 24 00:46:12.219000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffdc47f1b80 a2=94 a3=5 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.219000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.219000 audit: BPF prog-id=252 op=UNLOAD Jan 24 00:46:12.219000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes 
exit=0 a0=6 a1=7ffdc47f1b80 a2=0 a3=5 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.219000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.220000 audit: BPF prog-id=253 op=LOAD Jan 24 00:46:12.220000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdc47f1da0 a2=94 a3=6 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.220000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.220000 audit: BPF prog-id=253 op=UNLOAD Jan 24 00:46:12.220000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffdc47f1da0 a2=0 a3=6 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.220000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.220000 audit: BPF prog-id=254 op=LOAD Jan 24 00:46:12.220000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdc47f1550 a2=94 a3=88 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.220000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.220000 audit: BPF prog-id=255 op=LOAD Jan 24 00:46:12.220000 audit[5031]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffdc47f13d0 a2=94 a3=2 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.220000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.220000 audit: BPF prog-id=255 op=UNLOAD Jan 24 00:46:12.220000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffdc47f1400 a2=0 a3=7ffdc47f1500 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.220000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 
24 00:46:12.221000 audit: BPF prog-id=254 op=UNLOAD Jan 24 00:46:12.221000 audit[5031]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=b810d10 a2=0 a3=20c63a5aed5b1367 items=0 ppid=4977 pid=5031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.221000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 24 00:46:12.227000 audit: BPF prog-id=246 op=UNLOAD Jan 24 00:46:12.227000 audit[4977]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c000cd0200 a2=0 a3=0 items=0 ppid=4109 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.227000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 24 00:46:12.332000 audit[5064]: NETFILTER_CFG table=mangle:133 family=2 entries=16 op=nft_register_chain pid=5064 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 24 00:46:12.332000 audit[5064]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffd474d26d0 a2=0 a3=7ffd474d26bc items=0 ppid=4977 pid=5064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.332000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 24 00:46:12.339000 audit[5062]: NETFILTER_CFG table=raw:134 family=2 entries=21 op=nft_register_chain pid=5062 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 24 00:46:12.339000 audit[5062]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffcfb209760 a2=0 a3=7ffcfb20974c items=0 ppid=4977 pid=5062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.339000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 24 00:46:12.354000 audit[5068]: NETFILTER_CFG table=nat:135 family=2 entries=15 op=nft_register_chain pid=5068 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 24 00:46:12.354000 audit[5068]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffdb4d6ac00 a2=0 a3=7ffdb4d6abec items=0 ppid=4977 pid=5068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.354000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 24 00:46:12.359000 audit[5067]: NETFILTER_CFG table=filter:136 family=2 entries=327 op=nft_register_chain pid=5067 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 24 00:46:12.359000 audit[5067]: SYSCALL arch=c000003e syscall=46 
success=yes exit=193468 a0=3 a1=7ffd4d7e9570 a2=0 a3=7ffd4d7e955c items=0 ppid=4977 pid=5067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:46:12.359000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 24 00:46:13.308504 systemd-networkd[1553]: vxlan.calico: Gained IPv6LL Jan 24 00:46:16.618261 containerd[1682]: time="2026-01-24T00:46:16.617489046Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 24 00:46:17.070724 containerd[1682]: time="2026-01-24T00:46:17.070425441Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:17.072197 containerd[1682]: time="2026-01-24T00:46:17.072148333Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 24 00:46:17.072746 containerd[1682]: time="2026-01-24T00:46:17.072240013Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:17.073364 kubelet[2863]: E0124 00:46:17.073219 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:46:17.073364 kubelet[2863]: E0124 00:46:17.073283 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:46:17.076496 kubelet[2863]: E0124 00:46:17.073481 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c13ad1f9d9cd499a81c814dc308eb491,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:17.079669 containerd[1682]: time="2026-01-24T00:46:17.079577366Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 24 00:46:17.503395 containerd[1682]: time="2026-01-24T00:46:17.503306745Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:17.505131 containerd[1682]: time="2026-01-24T00:46:17.504959266Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 24 00:46:17.505131 containerd[1682]: time="2026-01-24T00:46:17.505095906Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:17.505429 kubelet[2863]: E0124 00:46:17.505364 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:46:17.505538 kubelet[2863]: E0124 00:46:17.505434 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:46:17.505656 kubelet[2863]: E0124 00:46:17.505592 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:17.507205 kubelet[2863]: E0124 00:46:17.507124 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:46:20.616720 containerd[1682]: time="2026-01-24T00:46:20.616647632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:46:21.048982 containerd[1682]: time="2026-01-24T00:46:21.048799392Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:21.050498 containerd[1682]: time="2026-01-24T00:46:21.050411573Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 
24 00:46:21.050599 containerd[1682]: time="2026-01-24T00:46:21.050514383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:21.051127 kubelet[2863]: E0124 00:46:21.050845 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:21.051127 kubelet[2863]: E0124 00:46:21.050905 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:21.052968 kubelet[2863]: E0124 00:46:21.052221 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llmb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:21.053928 containerd[1682]: time="2026-01-24T00:46:21.053420214Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 24 
00:46:21.054166 kubelet[2863]: E0124 00:46:21.053869 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:46:21.481502 containerd[1682]: time="2026-01-24T00:46:21.481440978Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:21.483048 containerd[1682]: time="2026-01-24T00:46:21.482900789Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 24 00:46:21.483193 containerd[1682]: time="2026-01-24T00:46:21.482968379Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:21.483556 kubelet[2863]: E0124 00:46:21.483493 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:46:21.483623 kubelet[2863]: E0124 00:46:21.483554 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:46:21.483828 kubelet[2863]: E0124 00:46:21.483716 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5fsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:21.485724 kubelet[2863]: E0124 00:46:21.485540 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:46:21.617972 containerd[1682]: time="2026-01-24T00:46:21.617443026Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 24 00:46:22.035342 containerd[1682]: time="2026-01-24T00:46:22.035282305Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:22.037057 containerd[1682]: time="2026-01-24T00:46:22.036979986Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 24 00:46:22.037057 containerd[1682]: time="2026-01-24T00:46:22.037013096Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:22.037379 kubelet[2863]: E0124 00:46:22.037242 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:46:22.037379 kubelet[2863]: E0124 00:46:22.037296 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:46:22.038284 kubelet[2863]: E0124 00:46:22.038140 2863 kuberuntime_manager.go:1358] 
"Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:22.042033 containerd[1682]: time="2026-01-24T00:46:22.041926049Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 24 00:46:22.480216 containerd[1682]: time="2026-01-24T00:46:22.480128666Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:22.481892 containerd[1682]: time="2026-01-24T00:46:22.481802147Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 24 00:46:22.482003 containerd[1682]: time="2026-01-24T00:46:22.481913778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:22.482302 kubelet[2863]: E0124 00:46:22.482210 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:46:22.482302 kubelet[2863]: E0124 00:46:22.482272 2863 kuberuntime_image.go:42] "Failed to 
pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:46:22.482939 kubelet[2863]: E0124 00:46:22.482442 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:22.483940 kubelet[2863]: E0124 00:46:22.483862 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:46:22.619106 containerd[1682]: time="2026-01-24T00:46:22.618532737Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 
00:46:23.056656 containerd[1682]: time="2026-01-24T00:46:23.056566686Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:23.058314 containerd[1682]: time="2026-01-24T00:46:23.058250357Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:46:23.058412 containerd[1682]: time="2026-01-24T00:46:23.058350707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:23.058775 kubelet[2863]: E0124 00:46:23.058620 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:23.058775 kubelet[2863]: E0124 00:46:23.058686 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:23.059655 kubelet[2863]: E0124 00:46:23.059515 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tm5dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:23.060985 containerd[1682]: time="2026-01-24T00:46:23.060930939Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 24 00:46:23.061478 kubelet[2863]: E0124 00:46:23.061415 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:46:23.482715 containerd[1682]: time="2026-01-24T00:46:23.482623202Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:23.484296 containerd[1682]: time="2026-01-24T00:46:23.484180393Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 24 00:46:23.484441 containerd[1682]: time="2026-01-24T00:46:23.484309774Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:23.484801 kubelet[2863]: E0124 00:46:23.484657 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:46:23.484801 kubelet[2863]: E0124 00:46:23.484712 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:46:23.486659 kubelet[2863]: E0124 00:46:23.485403 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:23.487004 kubelet[2863]: E0124 00:46:23.486669 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:46:28.619407 kubelet[2863]: E0124 00:46:28.619303 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:46:32.618011 kubelet[2863]: E0124 00:46:32.617850 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:46:35.617218 kubelet[2863]: E0124 00:46:35.616629 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:46:36.617251 kubelet[2863]: E0124 00:46:36.616946 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:46:36.620764 kubelet[2863]: E0124 00:46:36.619726 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:46:38.616848 
kubelet[2863]: E0124 00:46:38.616683 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:46:41.622422 containerd[1682]: time="2026-01-24T00:46:41.622288344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 24 00:46:42.052743 containerd[1682]: time="2026-01-24T00:46:42.052506771Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:42.053996 containerd[1682]: time="2026-01-24T00:46:42.053965747Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 24 00:46:42.055089 containerd[1682]: time="2026-01-24T00:46:42.054086745Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:42.055199 kubelet[2863]: E0124 00:46:42.055170 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:46:42.055894 kubelet[2863]: E0124 00:46:42.055493 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:46:42.055894 kubelet[2863]: E0124 00:46:42.055598 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c13ad1f9d9cd499a81c814dc308eb491,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:42.058964 containerd[1682]: time="2026-01-24T00:46:42.058884865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 24 00:46:42.486190 containerd[1682]: time="2026-01-24T00:46:42.486049848Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:42.488046 containerd[1682]: time="2026-01-24T00:46:42.487959968Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 24 00:46:42.488293 containerd[1682]: time="2026-01-24T00:46:42.488107476Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:42.488544 kubelet[2863]: E0124 00:46:42.488469 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:46:42.488544 kubelet[2863]: E0124 00:46:42.488540 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:46:42.488799 kubelet[2863]: E0124 00:46:42.488723 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:42.490587 kubelet[2863]: E0124 00:46:42.490494 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:46:46.616517 containerd[1682]: time="2026-01-24T00:46:46.616408464Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:46:47.033260 containerd[1682]: time="2026-01-24T00:46:47.033012799Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:47.034575 containerd[1682]: time="2026-01-24T00:46:47.034509016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 
24 00:46:47.034802 containerd[1682]: time="2026-01-24T00:46:47.034553806Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:47.035038 kubelet[2863]: E0124 00:46:47.034982 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:47.035038 kubelet[2863]: E0124 00:46:47.035023 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:47.035742 kubelet[2863]: E0124 00:46:47.035475 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tm5dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:47.036822 kubelet[2863]: E0124 00:46:47.036781 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:46:47.622328 containerd[1682]: time="2026-01-24T00:46:47.619223678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:46:48.041221 containerd[1682]: time="2026-01-24T00:46:48.041082053Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:48.042578 containerd[1682]: time="2026-01-24T00:46:48.042505340Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:46:48.042578 containerd[1682]: time="2026-01-24T00:46:48.042562620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:48.045247 kubelet[2863]: E0124 00:46:48.045204 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:48.045247 kubelet[2863]: E0124 00:46:48.045245 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:46:48.046316 kubelet[2863]: E0124 00:46:48.045426 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llmb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:48.046495 containerd[1682]: time="2026-01-24T00:46:48.045798121Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 24 00:46:48.047026 kubelet[2863]: E0124 00:46:48.046686 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:46:48.464492 containerd[1682]: time="2026-01-24T00:46:48.464396270Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:48.466297 containerd[1682]: time="2026-01-24T00:46:48.466152494Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 24 00:46:48.466591 containerd[1682]: time="2026-01-24T00:46:48.466353853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:48.466831 kubelet[2863]: E0124 00:46:48.466785 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:46:48.466992 kubelet[2863]: E0124 00:46:48.466906 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:46:48.467488 kubelet[2863]: E0124 00:46:48.467424 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5fsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:48.468711 kubelet[2863]: E0124 00:46:48.468683 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:46:48.619290 containerd[1682]: time="2026-01-24T00:46:48.619126319Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 24 00:46:49.062129 containerd[1682]: time="2026-01-24T00:46:49.062016488Z" 
level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:49.063879 containerd[1682]: time="2026-01-24T00:46:49.063732162Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 24 00:46:49.063879 containerd[1682]: time="2026-01-24T00:46:49.063801442Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:49.064405 kubelet[2863]: E0124 00:46:49.064314 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:46:49.064405 kubelet[2863]: E0124 00:46:49.064387 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:46:49.065562 kubelet[2863]: E0124 00:46:49.064553 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:49.067432 
containerd[1682]: time="2026-01-24T00:46:49.067047684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 24 00:46:49.492335 containerd[1682]: time="2026-01-24T00:46:49.492190177Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:49.494512 containerd[1682]: time="2026-01-24T00:46:49.494218540Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 24 00:46:49.494512 containerd[1682]: time="2026-01-24T00:46:49.494333249Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:49.494744 kubelet[2863]: E0124 00:46:49.494603 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:46:49.494984 kubelet[2863]: E0124 00:46:49.494949 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:46:49.496102 kubelet[2863]: E0124 00:46:49.495420 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:49.497169 kubelet[2863]: E0124 00:46:49.497115 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:46:51.618740 containerd[1682]: time="2026-01-24T00:46:51.618708505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 24 00:46:52.050327 containerd[1682]: time="2026-01-24T00:46:52.050208991Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:46:52.051438 containerd[1682]: time="2026-01-24T00:46:52.051413222Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 24 00:46:52.051496 containerd[1682]: time="2026-01-24T00:46:52.051479661Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 24 00:46:52.051656 kubelet[2863]: E0124 00:46:52.051601 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:46:52.051967 kubelet[2863]: E0124 00:46:52.051672 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:46:52.051967 kubelet[2863]: E0124 00:46:52.051803 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 24 00:46:52.052975 kubelet[2863]: E0124 00:46:52.052946 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:46:55.619040 kubelet[2863]: E0124 00:46:55.618610 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:46:57.619493 kubelet[2863]: E0124 00:46:57.619106 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:46:58.617316 kubelet[2863]: E0124 00:46:58.617243 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:47:02.620619 kubelet[2863]: E0124 00:47:02.620553 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:47:03.619212 kubelet[2863]: E0124 00:47:03.618926 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:47:06.615733 kubelet[2863]: E0124 00:47:06.615692 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:47:07.112922 systemd[1]: Started sshd@7-65.109.167.77:22-4.153.228.146:43098.service - OpenSSH per-connection server daemon (4.153.228.146:43098). Jan 24 00:47:07.130578 kernel: kauditd_printk_skb: 98 callbacks suppressed Jan 24 00:47:07.130737 kernel: audit: type=1130 audit(1769215627.112:741): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-65.109.167.77:22-4.153.228.146:43098 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:07.112000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-65.109.167.77:22-4.153.228.146:43098 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:07.822000 audit[5178]: USER_ACCT pid=5178 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:07.827229 sshd-session[5178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:07.829677 sshd[5178]: Accepted publickey for core from 4.153.228.146 port 43098 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:07.830271 kernel: audit: type=1101 audit(1769215627.822:742): pid=5178 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:07.824000 audit[5178]: CRED_ACQ pid=5178 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:07.841113 kernel: audit: type=1103 audit(1769215627.824:743): pid=5178 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:07.841160 kernel: audit: type=1006 audit(1769215627.824:744): pid=5178 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 24 00:47:07.824000 audit[5178]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcbec76da0 a2=3 a3=0 items=0 ppid=1 pid=5178 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:07.852057 kernel: audit: type=1300 audit(1769215627.824:744): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcbec76da0 a2=3 a3=0 items=0 ppid=1 pid=5178 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:07.852160 kernel: audit: type=1327 audit(1769215627.824:744): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:07.824000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:07.856093 systemd-logind[1656]: New session 9 of user core. Jan 24 00:47:07.858242 systemd[1]: Started session-9.scope - Session 9 of User core. 
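Editor's note: the 404s above come straight from ghcr.io. containerd resolves the tag through the OCI distribution API and the registry reports the manifest as missing. A minimal sketch of the same lookup, done outside containerd, follows; the anonymous token endpoint and Accept header are ghcr.io's usual conventions and are assumptions, not something taken from this log.

# Sketch: repeat the manifest lookup containerd performs for the failing
# reference ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4.
# The token endpoint and scope format are the standard ghcr.io conventions
# (assumed here), not shown in the log itself.
import json
import urllib.error
import urllib.request

REPO = "flatcar/calico/node-driver-registrar"  # repository from the failing image ref
TAG = "v3.30.4"

# ghcr.io hands out an anonymous bearer token even for public pulls.
token_url = f"https://ghcr.io/token?service=ghcr.io&scope=repository:{REPO}:pull"
token = json.load(urllib.request.urlopen(token_url))["token"]

req = urllib.request.Request(
    f"https://ghcr.io/v2/{REPO}/manifests/{TAG}",
    headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.oci.image.index.v1+json",
    },
    method="HEAD",
)
try:
    with urllib.request.urlopen(req) as resp:
        print("manifest found, HTTP", resp.status)
except urllib.error.HTTPError as err:
    # A 404 here matches the "fetch failed after status: 404 Not Found" entries above.
    print("manifest lookup failed, HTTP", err.code)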
Jan 24 00:47:07.861000 audit[5178]: USER_START pid=5178 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:07.867000 audit[5182]: CRED_ACQ pid=5182 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:07.871204 kernel: audit: type=1105 audit(1769215627.861:745): pid=5178 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:07.871281 kernel: audit: type=1103 audit(1769215627.867:746): pid=5182 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:08.356887 sshd[5182]: Connection closed by 4.153.228.146 port 43098 Jan 24 00:47:08.358283 sshd-session[5178]: pam_unix(sshd:session): session closed for user core Jan 24 00:47:08.360000 audit[5178]: USER_END pid=5178 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:08.361000 audit[5178]: CRED_DISP pid=5178 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:08.372204 systemd[1]: sshd@7-65.109.167.77:22-4.153.228.146:43098.service: Deactivated successfully. Jan 24 00:47:08.384233 kernel: audit: type=1106 audit(1769215628.360:747): pid=5178 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:08.384314 kernel: audit: type=1104 audit(1769215628.361:748): pid=5178 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:08.376917 systemd[1]: session-9.scope: Deactivated successfully. Jan 24 00:47:08.379287 systemd-logind[1656]: Session 9 logged out. Waiting for processes to exit. Jan 24 00:47:08.387629 systemd-logind[1656]: Removed session 9. Jan 24 00:47:08.372000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-65.109.167.77:22-4.153.228.146:43098 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:10.620118 kubelet[2863]: E0124 00:47:10.620011 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:47:11.620313 kubelet[2863]: E0124 00:47:11.620112 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:47:11.621323 kubelet[2863]: E0124 00:47:11.621203 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:47:13.501911 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:47:13.502151 kernel: audit: type=1130 audit(1769215633.496:750): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-65.109.167.77:22-4.153.228.146:43100 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:13.496000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-65.109.167.77:22-4.153.228.146:43100 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:13.496595 systemd[1]: Started sshd@8-65.109.167.77:22-4.153.228.146:43100.service - OpenSSH per-connection server daemon (4.153.228.146:43100). 
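Editor's note: by this point kubelet has moved the failing containers into ImagePullBackOff, so the same handful of image references keeps reappearing at growing intervals. Tallying the back-off records confirms it is a fixed set of tags rather than a growing one; the sketch below reads journal text on stdin (piping it from journalctl is an assumed invocation, not part of this log).

# Sketch: count the recurring "Back-off pulling image" records by image
# reference. Input is journal text on stdin (assumed invocation).
import re
import sys
from collections import Counter

pattern = re.compile(r'Back-off pulling image \\+"([^"\\]+)')
counts = Counter(m.group(1) for line in sys.stdin for m in pattern.finditer(line))

for image, hits in counts.most_common():
    print(f"{hits:4d}  {image}")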
Jan 24 00:47:14.234417 sshd[5198]: Accepted publickey for core from 4.153.228.146 port 43100 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:14.233000 audit[5198]: USER_ACCT pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.238335 sshd-session[5198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:14.241302 kernel: audit: type=1101 audit(1769215634.233:751): pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.235000 audit[5198]: CRED_ACQ pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.248151 kernel: audit: type=1103 audit(1769215634.235:752): pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.260044 kernel: audit: type=1006 audit(1769215634.235:753): pid=5198 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 24 00:47:14.260146 kernel: audit: type=1300 audit(1769215634.235:753): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd73365790 a2=3 a3=0 items=0 ppid=1 pid=5198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:14.235000 audit[5198]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd73365790 a2=3 a3=0 items=0 ppid=1 pid=5198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:14.235000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:14.264084 kernel: audit: type=1327 audit(1769215634.235:753): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:14.266357 systemd-logind[1656]: New session 10 of user core. Jan 24 00:47:14.269399 systemd[1]: Started session-10.scope - Session 10 of User core. 
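Editor's note: the PROCTITLE audit records in these SSH sessions carry the process title hex-encoded (the 737368642D... value above). Decoding it recovers the plain command line:

# Sketch: decode the hex-encoded proctitle field from the audit records above.
import binascii

hex_title = "737368642D73657373696F6E3A20636F7265205B707269765D"  # value from the log
print(binascii.unhexlify(hex_title).decode())  # prints: sshd-session: core [priv]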
Jan 24 00:47:14.278000 audit[5198]: USER_START pid=5198 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.287146 kernel: audit: type=1105 audit(1769215634.278:754): pid=5198 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.286000 audit[5202]: CRED_ACQ pid=5202 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.294110 kernel: audit: type=1103 audit(1769215634.286:755): pid=5202 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.663252 sshd[5202]: Connection closed by 4.153.228.146 port 43100 Jan 24 00:47:14.662254 sshd-session[5198]: pam_unix(sshd:session): session closed for user core Jan 24 00:47:14.664000 audit[5198]: USER_END pid=5198 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.668446 systemd[1]: sshd@8-65.109.167.77:22-4.153.228.146:43100.service: Deactivated successfully. Jan 24 00:47:14.670561 systemd[1]: session-10.scope: Deactivated successfully. Jan 24 00:47:14.672869 systemd-logind[1656]: Session 10 logged out. Waiting for processes to exit. Jan 24 00:47:14.673789 systemd-logind[1656]: Removed session 10. Jan 24 00:47:14.683116 kernel: audit: type=1106 audit(1769215634.664:756): pid=5198 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.664000 audit[5198]: CRED_DISP pid=5198 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:14.667000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-65.109.167.77:22-4.153.228.146:43100 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:14.697138 kernel: audit: type=1104 audit(1769215634.664:757): pid=5198 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:17.617015 kubelet[2863]: E0124 00:47:17.616941 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:47:18.616724 kubelet[2863]: E0124 00:47:18.615970 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:47:19.620542 kubelet[2863]: E0124 00:47:19.620472 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:47:19.794281 systemd[1]: Started sshd@9-65.109.167.77:22-4.153.228.146:55832.service - OpenSSH per-connection server daemon (4.153.228.146:55832). Jan 24 00:47:19.797332 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:47:19.797396 kernel: audit: type=1130 audit(1769215639.793:759): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-65.109.167.77:22-4.153.228.146:55832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:19.793000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-65.109.167.77:22-4.153.228.146:55832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:20.507000 audit[5216]: USER_ACCT pid=5216 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:20.510162 sshd[5216]: Accepted publickey for core from 4.153.228.146 port 55832 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:20.513719 sshd-session[5216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:20.511000 audit[5216]: CRED_ACQ pid=5216 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:20.517156 kernel: audit: type=1101 audit(1769215640.507:760): pid=5216 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:20.517203 kernel: audit: type=1103 audit(1769215640.511:761): pid=5216 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:20.522406 kernel: audit: type=1006 audit(1769215640.511:762): pid=5216 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 24 00:47:20.523238 systemd-logind[1656]: New session 11 of user core. Jan 24 00:47:20.511000 audit[5216]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeb5bf08b0 a2=3 a3=0 items=0 ppid=1 pid=5216 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:20.527285 kernel: audit: type=1300 audit(1769215640.511:762): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeb5bf08b0 a2=3 a3=0 items=0 ppid=1 pid=5216 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:20.511000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:20.532538 kernel: audit: type=1327 audit(1769215640.511:762): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:20.535201 systemd[1]: Started session-11.scope - Session 11 of User core. 
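Editor's note: the kernel audit records are stamped audit(<epoch>.<msec>:<serial>) rather than with wall-clock time. Converting one of the epochs above shows it lines up with the journald timestamp on the same entry; the times in this log are UTC.

# Sketch: convert the audit epoch from "audit(1769215640.507:760)" above
# to an ISO timestamp; it matches the journald stamp Jan 24 00:47:20.507.
from datetime import datetime, timezone

epoch = 1769215640.507
print(datetime.fromtimestamp(epoch, tz=timezone.utc).isoformat())
# 2026-01-24T00:47:20.507000+00:00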
Jan 24 00:47:20.539000 audit[5216]: USER_START pid=5216 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:20.548084 kernel: audit: type=1105 audit(1769215640.539:763): pid=5216 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:20.547000 audit[5220]: CRED_ACQ pid=5220 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:20.554092 kernel: audit: type=1103 audit(1769215640.547:764): pid=5220 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:21.018317 sshd[5220]: Connection closed by 4.153.228.146 port 55832 Jan 24 00:47:21.021362 sshd-session[5216]: pam_unix(sshd:session): session closed for user core Jan 24 00:47:21.043254 kernel: audit: type=1106 audit(1769215641.023:765): pid=5216 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:21.023000 audit[5216]: USER_END pid=5216 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:21.030021 systemd[1]: sshd@9-65.109.167.77:22-4.153.228.146:55832.service: Deactivated successfully. Jan 24 00:47:21.034826 systemd[1]: session-11.scope: Deactivated successfully. Jan 24 00:47:21.038144 systemd-logind[1656]: Session 11 logged out. Waiting for processes to exit. Jan 24 00:47:21.043756 systemd-logind[1656]: Removed session 11. Jan 24 00:47:21.060024 kernel: audit: type=1104 audit(1769215641.024:766): pid=5216 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:21.024000 audit[5216]: CRED_DISP pid=5216 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:21.029000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-65.109.167.77:22-4.153.228.146:55832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:23.618259 kubelet[2863]: E0124 00:47:23.617948 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:47:25.617563 kubelet[2863]: E0124 00:47:25.617516 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:47:25.618785 containerd[1682]: time="2026-01-24T00:47:25.618532224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 24 00:47:26.039799 containerd[1682]: time="2026-01-24T00:47:26.039441746Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:47:26.041460 containerd[1682]: time="2026-01-24T00:47:26.041375118Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 24 00:47:26.041571 containerd[1682]: time="2026-01-24T00:47:26.041540048Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 24 00:47:26.041953 kubelet[2863]: E0124 00:47:26.041882 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:47:26.042038 kubelet[2863]: E0124 00:47:26.041995 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:47:26.042864 kubelet[2863]: E0124 00:47:26.042391 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c13ad1f9d9cd499a81c814dc308eb491,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 24 00:47:26.045384 containerd[1682]: time="2026-01-24T00:47:26.045320615Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 24 00:47:26.151763 systemd[1]: Started sshd@10-65.109.167.77:22-4.153.228.146:48872.service - OpenSSH per-connection server daemon (4.153.228.146:48872). Jan 24 00:47:26.156147 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:47:26.156430 kernel: audit: type=1130 audit(1769215646.151:768): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-65.109.167.77:22-4.153.228.146:48872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:26.151000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-65.109.167.77:22-4.153.228.146:48872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:26.644615 containerd[1682]: time="2026-01-24T00:47:26.644187806Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:47:26.646529 containerd[1682]: time="2026-01-24T00:47:26.645552633Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 24 00:47:26.646646 containerd[1682]: time="2026-01-24T00:47:26.645604872Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 24 00:47:26.646824 kubelet[2863]: E0124 00:47:26.646791 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:47:26.647046 kubelet[2863]: E0124 00:47:26.646836 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:47:26.647046 kubelet[2863]: E0124 00:47:26.646932 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc 
= failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 24 00:47:26.648346 kubelet[2863]: E0124 00:47:26.648321 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:47:26.811049 sshd[5235]: Accepted publickey for core from 4.153.228.146 port 48872 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:26.809000 audit[5235]: USER_ACCT pid=5235 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:26.814504 sshd-session[5235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:26.824121 kernel: audit: type=1101 audit(1769215646.809:769): pid=5235 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:26.824180 kernel: audit: type=1103 audit(1769215646.811:770): pid=5235 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:26.811000 audit[5235]: CRED_ACQ pid=5235 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:26.822573 systemd-logind[1656]: New session 12 of user core. Jan 24 00:47:26.827766 kernel: audit: type=1006 audit(1769215646.811:771): pid=5235 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 24 00:47:26.811000 audit[5235]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc1ab4f2b0 a2=3 a3=0 items=0 ppid=1 pid=5235 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:26.834839 kernel: audit: type=1300 audit(1769215646.811:771): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc1ab4f2b0 a2=3 a3=0 items=0 ppid=1 pid=5235 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:26.834259 systemd[1]: Started session-12.scope - Session 12 of User core. 
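Editor's note: the kubelet "Unhandled Error" entries dump each failing container spec as an inlined Go struct, which is hard to read. The same fields (image, container names, waiting reason) can be read from the API server directly; the sketch below assumes the kubernetes Python client package and a working kubeconfig, neither of which appears in this log.

# Sketch: inspect the whisker pod named in the errors above via the API
# instead of the inlined struct dump. Assumes the `kubernetes` client
# package and kubeconfig access (not part of this log).
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()

pod = v1.read_namespaced_pod("whisker-58c54c478-wd6fd", "calico-system")
for c in pod.spec.containers:
    print("container:", c.name, "image:", c.image)
for s in pod.status.container_statuses or []:
    if s.state.waiting:
        print("waiting:", s.name, s.state.waiting.reason)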
Jan 24 00:47:26.838481 kernel: audit: type=1327 audit(1769215646.811:771): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:26.811000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:26.841000 audit[5235]: USER_START pid=5235 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:26.849829 kernel: audit: type=1105 audit(1769215646.841:772): pid=5235 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:26.849000 audit[5239]: CRED_ACQ pid=5239 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:26.857088 kernel: audit: type=1103 audit(1769215646.849:773): pid=5239 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:27.242332 sshd[5239]: Connection closed by 4.153.228.146 port 48872 Jan 24 00:47:27.243252 sshd-session[5235]: pam_unix(sshd:session): session closed for user core Jan 24 00:47:27.252147 kernel: audit: type=1106 audit(1769215647.243:774): pid=5235 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:27.243000 audit[5235]: USER_END pid=5235 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:27.247020 systemd-logind[1656]: Session 12 logged out. Waiting for processes to exit. Jan 24 00:47:27.248902 systemd[1]: sshd@10-65.109.167.77:22-4.153.228.146:48872.service: Deactivated successfully. Jan 24 00:47:27.251097 systemd[1]: session-12.scope: Deactivated successfully. Jan 24 00:47:27.244000 audit[5235]: CRED_DISP pid=5235 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:27.253475 systemd-logind[1656]: Removed session 12. Jan 24 00:47:27.246000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-65.109.167.77:22-4.153.228.146:48872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:27.261101 kernel: audit: type=1104 audit(1769215647.244:775): pid=5235 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:28.617395 kubelet[2863]: E0124 00:47:28.617346 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:47:32.406309 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:47:32.406451 kernel: audit: type=1130 audit(1769215652.385:777): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-65.109.167.77:22-4.153.228.146:48874 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:32.385000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-65.109.167.77:22-4.153.228.146:48874 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:32.386736 systemd[1]: Started sshd@11-65.109.167.77:22-4.153.228.146:48874.service - OpenSSH per-connection server daemon (4.153.228.146:48874). Jan 24 00:47:33.076701 sshd[5258]: Accepted publickey for core from 4.153.228.146 port 48874 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:33.075000 audit[5258]: USER_ACCT pid=5258 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.079108 sshd-session[5258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:33.084247 kernel: audit: type=1101 audit(1769215653.075:778): pid=5258 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.076000 audit[5258]: CRED_ACQ pid=5258 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.089770 systemd-logind[1656]: New session 13 of user core. 
Jan 24 00:47:33.093244 kernel: audit: type=1103 audit(1769215653.076:779): pid=5258 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.076000 audit[5258]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff8388490 a2=3 a3=0 items=0 ppid=1 pid=5258 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:33.099115 kernel: audit: type=1006 audit(1769215653.076:780): pid=5258 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 24 00:47:33.099153 kernel: audit: type=1300 audit(1769215653.076:780): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff8388490 a2=3 a3=0 items=0 ppid=1 pid=5258 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:33.098234 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 24 00:47:33.076000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:33.104708 kernel: audit: type=1327 audit(1769215653.076:780): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:33.106000 audit[5258]: USER_START pid=5258 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.108859 kernel: audit: type=1105 audit(1769215653.106:781): pid=5258 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.115074 kernel: audit: type=1103 audit(1769215653.113:782): pid=5291 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.113000 audit[5291]: CRED_ACQ pid=5291 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.540178 sshd[5291]: Connection closed by 4.153.228.146 port 48874 Jan 24 00:47:33.540634 sshd-session[5258]: pam_unix(sshd:session): session closed for user core Jan 24 00:47:33.541000 audit[5258]: USER_END pid=5258 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.544560 systemd-logind[1656]: Session 13 logged out. Waiting for processes to exit. 
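Editor's note: the pod sync errors keep naming the same pods and podUIDs, which is the expected back-off cycle rather than new pods failing. Extracting the distinct pod/podUID pairs makes that explicit; as before, journal text on stdin is an assumed invocation.

# Sketch: list the distinct pod / podUID pairs from the "Error syncing pod,
# skipping" records above. Journal text is expected on stdin (assumption).
import re
import sys

pairs = sorted(set(re.findall(r'pod="([^"]+)" podUID="([^"]+)"', sys.stdin.read())))
for pod, uid in pairs:
    print(uid, pod)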
Jan 24 00:47:33.545264 systemd[1]: sshd@11-65.109.167.77:22-4.153.228.146:48874.service: Deactivated successfully. Jan 24 00:47:33.547509 systemd[1]: session-13.scope: Deactivated successfully. Jan 24 00:47:33.550207 systemd-logind[1656]: Removed session 13. Jan 24 00:47:33.560095 kernel: audit: type=1106 audit(1769215653.541:783): pid=5258 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.541000 audit[5258]: CRED_DISP pid=5258 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.544000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-65.109.167.77:22-4.153.228.146:48874 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:33.572473 kernel: audit: type=1104 audit(1769215653.541:784): pid=5258 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:33.618574 containerd[1682]: time="2026-01-24T00:47:33.618487234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 24 00:47:34.041417 containerd[1682]: time="2026-01-24T00:47:34.041321324Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:47:34.043216 containerd[1682]: time="2026-01-24T00:47:34.043126958Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 24 00:47:34.043307 containerd[1682]: time="2026-01-24T00:47:34.043229798Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 24 00:47:34.043485 kubelet[2863]: E0124 00:47:34.043429 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:47:34.044055 kubelet[2863]: E0124 00:47:34.043488 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:47:34.044055 kubelet[2863]: E0124 00:47:34.043838 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5fsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 24 00:47:34.045522 containerd[1682]: time="2026-01-24T00:47:34.045135502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 24 00:47:34.045637 kubelet[2863]: E0124 00:47:34.045328 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:47:34.478825 containerd[1682]: 
time="2026-01-24T00:47:34.478713592Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:47:34.480564 containerd[1682]: time="2026-01-24T00:47:34.480509498Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 24 00:47:34.480759 containerd[1682]: time="2026-01-24T00:47:34.480609587Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 24 00:47:34.480907 kubelet[2863]: E0124 00:47:34.480832 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:47:34.481829 kubelet[2863]: E0124 00:47:34.481508 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:47:34.482148 kubelet[2863]: E0124 00:47:34.481997 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 24 00:47:34.483786 kubelet[2863]: E0124 00:47:34.483708 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:47:38.617392 containerd[1682]: time="2026-01-24T00:47:38.616963037Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:47:38.687000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-65.109.167.77:22-4.153.228.146:43222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:38.687196 systemd[1]: Started sshd@12-65.109.167.77:22-4.153.228.146:43222.service - OpenSSH per-connection server daemon (4.153.228.146:43222). Jan 24 00:47:38.703524 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:47:38.703634 kernel: audit: type=1130 audit(1769215658.687:786): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-65.109.167.77:22-4.153.228.146:43222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:39.060911 containerd[1682]: time="2026-01-24T00:47:39.060432465Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:47:39.061760 containerd[1682]: time="2026-01-24T00:47:39.061672962Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:47:39.061760 containerd[1682]: time="2026-01-24T00:47:39.061725791Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:47:39.061990 kubelet[2863]: E0124 00:47:39.061937 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:47:39.061990 kubelet[2863]: E0124 00:47:39.061980 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:47:39.063485 kubelet[2863]: E0124 00:47:39.062098 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tm5dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:47:39.064214 kubelet[2863]: E0124 00:47:39.063783 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:47:39.393000 audit[5304]: USER_ACCT pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.407109 kernel: audit: type=1101 audit(1769215659.393:787): pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.407196 sshd[5304]: Accepted publickey for core from 4.153.228.146 port 43222 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:39.411011 sshd-session[5304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:39.422138 kernel: audit: type=1103 audit(1769215659.408:788): pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.408000 audit[5304]: CRED_ACQ pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.427115 kernel: audit: type=1006 audit(1769215659.408:789): pid=5304 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 24 00:47:39.408000 audit[5304]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff67cb4af0 a2=3 a3=0 items=0 
ppid=1 pid=5304 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:39.435427 kernel: audit: type=1300 audit(1769215659.408:789): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff67cb4af0 a2=3 a3=0 items=0 ppid=1 pid=5304 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:39.408000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:39.439121 kernel: audit: type=1327 audit(1769215659.408:789): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:39.440955 systemd-logind[1656]: New session 14 of user core. Jan 24 00:47:39.447228 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 24 00:47:39.450000 audit[5304]: USER_START pid=5304 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.465087 kernel: audit: type=1105 audit(1769215659.450:790): pid=5304 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.465152 kernel: audit: type=1103 audit(1769215659.458:791): pid=5308 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.458000 audit[5308]: CRED_ACQ pid=5308 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.622093 containerd[1682]: time="2026-01-24T00:47:39.621085647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 24 00:47:39.628259 kubelet[2863]: E0124 00:47:39.628176 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:47:39.855813 sshd[5308]: Connection closed by 4.153.228.146 port 43222 Jan 24 00:47:39.856981 sshd-session[5304]: 
pam_unix(sshd:session): session closed for user core Jan 24 00:47:39.858000 audit[5304]: USER_END pid=5304 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.862112 systemd[1]: sshd@12-65.109.167.77:22-4.153.228.146:43222.service: Deactivated successfully. Jan 24 00:47:39.865094 systemd[1]: session-14.scope: Deactivated successfully. Jan 24 00:47:39.868637 kernel: audit: type=1106 audit(1769215659.858:792): pid=5304 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.868033 systemd-logind[1656]: Session 14 logged out. Waiting for processes to exit. Jan 24 00:47:39.869114 systemd-logind[1656]: Removed session 14. Jan 24 00:47:39.858000 audit[5304]: CRED_DISP pid=5304 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.876094 kernel: audit: type=1104 audit(1769215659.858:793): pid=5304 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:39.861000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-65.109.167.77:22-4.153.228.146:43222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:40.042112 containerd[1682]: time="2026-01-24T00:47:40.042070827Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:47:40.043573 containerd[1682]: time="2026-01-24T00:47:40.043518753Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 24 00:47:40.043675 containerd[1682]: time="2026-01-24T00:47:40.043585913Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 24 00:47:40.044114 kubelet[2863]: E0124 00:47:40.044040 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:47:40.044159 kubelet[2863]: E0124 00:47:40.044127 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:47:40.044275 kubelet[2863]: E0124 00:47:40.044242 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 24 00:47:40.046580 containerd[1682]: time="2026-01-24T00:47:40.046446165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 24 00:47:40.482934 containerd[1682]: time="2026-01-24T00:47:40.482814203Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:47:40.484309 containerd[1682]: time="2026-01-24T00:47:40.484220779Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 24 00:47:40.484399 containerd[1682]: time="2026-01-24T00:47:40.484332729Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 24 00:47:40.484687 kubelet[2863]: E0124 00:47:40.484616 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:47:40.485951 kubelet[2863]: E0124 00:47:40.484688 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:47:40.485951 kubelet[2863]: E0124 00:47:40.484936 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 24 00:47:40.486765 kubelet[2863]: E0124 00:47:40.486653 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:47:40.617378 containerd[1682]: time="2026-01-24T00:47:40.617277358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:47:41.045867 containerd[1682]: time="2026-01-24T00:47:41.045700978Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:47:41.046931 containerd[1682]: time="2026-01-24T00:47:41.046843655Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:47:41.046931 containerd[1682]: time="2026-01-24T00:47:41.046901655Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:47:41.047256 kubelet[2863]: E0124 00:47:41.047221 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:47:41.047296 kubelet[2863]: E0124 00:47:41.047263 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:47:41.047577 kubelet[2863]: E0124 00:47:41.047391 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llmb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:47:41.048899 kubelet[2863]: E0124 00:47:41.048879 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:47:44.998910 systemd[1]: Started sshd@13-65.109.167.77:22-4.153.228.146:54802.service - OpenSSH per-connection server daemon (4.153.228.146:54802). Jan 24 00:47:44.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-65.109.167.77:22-4.153.228.146:54802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:45.019348 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:47:45.019506 kernel: audit: type=1130 audit(1769215664.998:795): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-65.109.167.77:22-4.153.228.146:54802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:45.695000 audit[5320]: USER_ACCT pid=5320 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:45.703240 sshd[5320]: Accepted publickey for core from 4.153.228.146 port 54802 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:45.709338 kernel: audit: type=1101 audit(1769215665.695:796): pid=5320 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:45.709455 kernel: audit: type=1103 audit(1769215665.702:797): pid=5320 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:45.702000 audit[5320]: CRED_ACQ pid=5320 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:45.705766 sshd-session[5320]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:45.714073 kernel: audit: type=1006 audit(1769215665.702:798): pid=5320 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Jan 24 00:47:45.702000 audit[5320]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd1998520 a2=3 a3=0 items=0 ppid=1 pid=5320 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:45.722090 kernel: audit: type=1300 audit(1769215665.702:798): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd1998520 a2=3 a3=0 items=0 ppid=1 pid=5320 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:45.722809 systemd-logind[1656]: New session 15 of user core. 
Jan 24 00:47:45.702000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:45.728094 kernel: audit: type=1327 audit(1769215665.702:798): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:45.730438 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 24 00:47:45.741501 kernel: audit: type=1105 audit(1769215665.732:799): pid=5320 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:45.732000 audit[5320]: USER_START pid=5320 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:45.743000 audit[5324]: CRED_ACQ pid=5324 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:45.750090 kernel: audit: type=1103 audit(1769215665.743:800): pid=5324 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:46.133414 sshd[5324]: Connection closed by 4.153.228.146 port 54802 Jan 24 00:47:46.133947 sshd-session[5320]: pam_unix(sshd:session): session closed for user core Jan 24 00:47:46.133000 audit[5320]: USER_END pid=5320 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:46.142105 kernel: audit: type=1106 audit(1769215666.133:801): pid=5320 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:46.134000 audit[5320]: CRED_DISP pid=5320 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:46.146707 systemd[1]: sshd@13-65.109.167.77:22-4.153.228.146:54802.service: Deactivated successfully. Jan 24 00:47:46.143000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-65.109.167.77:22-4.153.228.146:54802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:46.148140 kernel: audit: type=1104 audit(1769215666.134:802): pid=5320 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:46.149795 systemd[1]: session-15.scope: Deactivated successfully. Jan 24 00:47:46.152826 systemd-logind[1656]: Session 15 logged out. Waiting for processes to exit. Jan 24 00:47:46.155810 systemd-logind[1656]: Removed session 15. Jan 24 00:47:46.618090 kubelet[2863]: E0124 00:47:46.617457 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:47:47.622955 kubelet[2863]: E0124 00:47:47.622914 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:47:49.618170 kubelet[2863]: E0124 00:47:49.617763 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:47:51.286299 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:47:51.286405 kernel: audit: type=1130 audit(1769215671.276:804): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-65.109.167.77:22-4.153.228.146:54816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:51.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-65.109.167.77:22-4.153.228.146:54816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:51.276545 systemd[1]: Started sshd@14-65.109.167.77:22-4.153.228.146:54816.service - OpenSSH per-connection server daemon (4.153.228.146:54816). 
Jan 24 00:47:51.619665 kubelet[2863]: E0124 00:47:51.619507 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:47:51.956000 audit[5358]: USER_ACCT pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:51.960622 sshd-session[5358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:51.962303 sshd[5358]: Accepted publickey for core from 4.153.228.146 port 54816 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:51.972267 kernel: audit: type=1101 audit(1769215671.956:805): pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:51.956000 audit[5358]: CRED_ACQ pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:51.982533 systemd-logind[1656]: New session 16 of user core. Jan 24 00:47:51.994119 kernel: audit: type=1103 audit(1769215671.956:806): pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:51.995654 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 24 00:47:52.007365 kernel: audit: type=1006 audit(1769215671.956:807): pid=5358 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 24 00:47:52.007474 kernel: audit: type=1300 audit(1769215671.956:807): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc255a4f50 a2=3 a3=0 items=0 ppid=1 pid=5358 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:51.956000 audit[5358]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc255a4f50 a2=3 a3=0 items=0 ppid=1 pid=5358 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:51.956000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:52.023000 audit[5358]: USER_START pid=5358 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:52.033145 kernel: audit: type=1327 audit(1769215671.956:807): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:52.033282 kernel: audit: type=1105 audit(1769215672.023:808): pid=5358 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:52.028000 audit[5362]: CRED_ACQ pid=5362 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:52.041959 kernel: audit: type=1103 audit(1769215672.028:809): pid=5362 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:52.378626 sshd[5362]: Connection closed by 4.153.228.146 port 54816 Jan 24 00:47:52.381913 sshd-session[5358]: pam_unix(sshd:session): session closed for user core Jan 24 00:47:52.395275 kernel: audit: type=1106 audit(1769215672.386:810): pid=5358 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:52.386000 audit[5358]: USER_END pid=5358 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:52.396284 systemd[1]: sshd@14-65.109.167.77:22-4.153.228.146:54816.service: Deactivated 
successfully. Jan 24 00:47:52.396678 systemd-logind[1656]: Session 16 logged out. Waiting for processes to exit. Jan 24 00:47:52.386000 audit[5358]: CRED_DISP pid=5358 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:52.401459 systemd[1]: session-16.scope: Deactivated successfully. Jan 24 00:47:52.405104 kernel: audit: type=1104 audit(1769215672.386:811): pid=5358 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:52.404730 systemd-logind[1656]: Removed session 16. Jan 24 00:47:52.399000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-65.109.167.77:22-4.153.228.146:54816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:54.621439 kubelet[2863]: E0124 00:47:54.621361 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:47:55.623302 kubelet[2863]: E0124 00:47:55.623247 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:47:57.516000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-65.109.167.77:22-4.153.228.146:57008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:57.516505 systemd[1]: Started sshd@15-65.109.167.77:22-4.153.228.146:57008.service - OpenSSH per-connection server daemon (4.153.228.146:57008). Jan 24 00:47:57.518020 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:47:57.518075 kernel: audit: type=1130 audit(1769215677.516:813): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-65.109.167.77:22-4.153.228.146:57008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:47:58.207000 audit[5375]: USER_ACCT pid=5375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.212671 sshd-session[5375]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:47:58.214933 sshd[5375]: Accepted publickey for core from 4.153.228.146 port 57008 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:47:58.223227 kernel: audit: type=1101 audit(1769215678.207:814): pid=5375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.207000 audit[5375]: CRED_ACQ pid=5375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.237133 kernel: audit: type=1103 audit(1769215678.207:815): pid=5375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.207000 audit[5375]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff04379eb0 a2=3 a3=0 items=0 ppid=1 pid=5375 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:58.250619 kernel: audit: type=1006 audit(1769215678.207:816): pid=5375 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 24 00:47:58.250897 kernel: audit: type=1300 audit(1769215678.207:816): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff04379eb0 a2=3 a3=0 items=0 ppid=1 pid=5375 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:47:58.207000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:58.265158 kernel: audit: type=1327 audit(1769215678.207:816): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:47:58.268640 systemd-logind[1656]: New session 17 of user core. Jan 24 00:47:58.277302 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 24 00:47:58.282000 audit[5375]: USER_START pid=5375 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.298373 kernel: audit: type=1105 audit(1769215678.282:817): pid=5375 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.298466 kernel: audit: type=1103 audit(1769215678.283:818): pid=5379 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.283000 audit[5379]: CRED_ACQ pid=5379 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.615697 kubelet[2863]: E0124 00:47:58.615645 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:47:58.654506 sshd[5379]: Connection closed by 4.153.228.146 port 57008 Jan 24 00:47:58.656282 sshd-session[5375]: pam_unix(sshd:session): session closed for user core Jan 24 00:47:58.670829 kernel: audit: type=1106 audit(1769215678.662:819): pid=5375 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.662000 audit[5375]: USER_END pid=5375 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.670418 systemd[1]: sshd@15-65.109.167.77:22-4.153.228.146:57008.service: Deactivated successfully. Jan 24 00:47:58.671925 systemd-logind[1656]: Session 17 logged out. Waiting for processes to exit. Jan 24 00:47:58.678785 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 24 00:47:58.662000 audit[5375]: CRED_DISP pid=5375 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:47:58.685158 systemd-logind[1656]: Removed session 17. Jan 24 00:47:58.670000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-65.109.167.77:22-4.153.228.146:57008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:47:58.689081 kernel: audit: type=1104 audit(1769215678.662:820): pid=5375 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:01.617388 kubelet[2863]: E0124 00:48:01.617339 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:48:02.617532 kubelet[2863]: E0124 00:48:02.617387 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:48:03.794828 systemd[1]: Started sshd@16-65.109.167.77:22-4.153.228.146:57016.service - OpenSSH per-connection server daemon (4.153.228.146:57016). Jan 24 00:48:03.808735 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:03.808791 kernel: audit: type=1130 audit(1769215683.794:822): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-65.109.167.77:22-4.153.228.146:57016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:03.794000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-65.109.167.77:22-4.153.228.146:57016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:48:04.476000 audit[5419]: USER_ACCT pid=5419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.482265 sshd-session[5419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:04.483437 kernel: audit: type=1101 audit(1769215684.476:823): pid=5419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.483460 sshd[5419]: Accepted publickey for core from 4.153.228.146 port 57016 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:04.478000 audit[5419]: CRED_ACQ pid=5419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.490133 kernel: audit: type=1103 audit(1769215684.478:824): pid=5419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.493745 systemd-logind[1656]: New session 18 of user core. Jan 24 00:48:04.499343 kernel: audit: type=1006 audit(1769215684.478:825): pid=5419 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 24 00:48:04.499478 kernel: audit: type=1300 audit(1769215684.478:825): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffce4ed9650 a2=3 a3=0 items=0 ppid=1 pid=5419 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:04.478000 audit[5419]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffce4ed9650 a2=3 a3=0 items=0 ppid=1 pid=5419 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:04.478000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:04.504025 kernel: audit: type=1327 audit(1769215684.478:825): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:04.504605 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 24 00:48:04.510000 audit[5419]: USER_START pid=5419 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.518127 kernel: audit: type=1105 audit(1769215684.510:826): pid=5419 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.518000 audit[5423]: CRED_ACQ pid=5423 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.526111 kernel: audit: type=1103 audit(1769215684.518:827): pid=5423 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.617034 kubelet[2863]: E0124 00:48:04.616965 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:48:04.961590 sshd[5423]: Connection closed by 4.153.228.146 port 57016 Jan 24 00:48:04.963242 sshd-session[5419]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:04.965000 audit[5419]: USER_END pid=5419 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.974111 kernel: audit: type=1106 audit(1769215684.965:828): pid=5419 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.974264 kernel: audit: type=1104 audit(1769215684.965:829): pid=5419 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 
addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.965000 audit[5419]: CRED_DISP pid=5419 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:04.975305 systemd[1]: sshd@16-65.109.167.77:22-4.153.228.146:57016.service: Deactivated successfully. Jan 24 00:48:04.976000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-65.109.167.77:22-4.153.228.146:57016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:04.982721 systemd[1]: session-18.scope: Deactivated successfully. Jan 24 00:48:04.986325 systemd-logind[1656]: Session 18 logged out. Waiting for processes to exit. Jan 24 00:48:04.990561 systemd-logind[1656]: Removed session 18. Jan 24 00:48:06.617685 kubelet[2863]: E0124 00:48:06.616552 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:48:07.618651 kubelet[2863]: E0124 00:48:07.617914 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:48:09.616741 kubelet[2863]: E0124 00:48:09.616683 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:48:10.092000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-65.109.167.77:22-4.153.228.146:41564 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:10.092736 systemd[1]: Started sshd@17-65.109.167.77:22-4.153.228.146:41564.service - OpenSSH per-connection server daemon (4.153.228.146:41564). 
Jan 24 00:48:10.094181 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:10.094340 kernel: audit: type=1130 audit(1769215690.092:831): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-65.109.167.77:22-4.153.228.146:41564 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:10.748000 audit[5436]: USER_ACCT pid=5436 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:10.756000 sshd-session[5436]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:10.761888 sshd[5436]: Accepted publickey for core from 4.153.228.146 port 41564 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:10.765104 kernel: audit: type=1101 audit(1769215690.748:832): pid=5436 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:10.765619 kernel: audit: type=1103 audit(1769215690.748:833): pid=5436 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:10.748000 audit[5436]: CRED_ACQ pid=5436 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:10.784224 kernel: audit: type=1006 audit(1769215690.748:834): pid=5436 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 24 00:48:10.748000 audit[5436]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff10fccc20 a2=3 a3=0 items=0 ppid=1 pid=5436 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:10.798225 kernel: audit: type=1300 audit(1769215690.748:834): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff10fccc20 a2=3 a3=0 items=0 ppid=1 pid=5436 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:10.796519 systemd-logind[1656]: New session 19 of user core. Jan 24 00:48:10.748000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:10.809165 kernel: audit: type=1327 audit(1769215690.748:834): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:10.814387 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 24 00:48:10.826000 audit[5436]: USER_START pid=5436 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:10.836000 audit[5440]: CRED_ACQ pid=5440 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:10.853470 kernel: audit: type=1105 audit(1769215690.826:835): pid=5436 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:10.853592 kernel: audit: type=1103 audit(1769215690.836:836): pid=5440 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:11.239186 sshd[5440]: Connection closed by 4.153.228.146 port 41564 Jan 24 00:48:11.242205 sshd-session[5436]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:11.244000 audit[5436]: USER_END pid=5436 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:11.252983 systemd[1]: sshd@17-65.109.167.77:22-4.153.228.146:41564.service: Deactivated successfully. Jan 24 00:48:11.253353 kernel: audit: type=1106 audit(1769215691.244:837): pid=5436 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:11.256783 systemd[1]: session-19.scope: Deactivated successfully. Jan 24 00:48:11.261201 systemd-logind[1656]: Session 19 logged out. Waiting for processes to exit. Jan 24 00:48:11.262880 systemd-logind[1656]: Removed session 19. Jan 24 00:48:11.245000 audit[5436]: CRED_DISP pid=5436 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:11.253000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-65.109.167.77:22-4.153.228.146:41564 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:48:11.270257 kernel: audit: type=1104 audit(1769215691.245:838): pid=5436 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:13.621457 kubelet[2863]: E0124 00:48:13.621419 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:48:14.620083 kubelet[2863]: E0124 00:48:14.619167 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:48:16.380630 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:16.380730 kernel: audit: type=1130 audit(1769215696.377:840): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-65.109.167.77:22-4.153.228.146:60800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:16.377000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-65.109.167.77:22-4.153.228.146:60800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:16.378980 systemd[1]: Started sshd@18-65.109.167.77:22-4.153.228.146:60800.service - OpenSSH per-connection server daemon (4.153.228.146:60800). 
Jan 24 00:48:17.030000 audit[5454]: USER_ACCT pid=5454 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.031893 sshd[5454]: Accepted publickey for core from 4.153.228.146 port 60800 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:17.041559 sshd-session[5454]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:17.046737 kernel: audit: type=1101 audit(1769215697.030:841): pid=5454 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.034000 audit[5454]: CRED_ACQ pid=5454 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.065284 kernel: audit: type=1103 audit(1769215697.034:842): pid=5454 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.065398 kernel: audit: type=1006 audit(1769215697.034:843): pid=5454 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 24 00:48:17.034000 audit[5454]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff26c5e840 a2=3 a3=0 items=0 ppid=1 pid=5454 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:17.071275 systemd-logind[1656]: New session 20 of user core. Jan 24 00:48:17.076252 kernel: audit: type=1300 audit(1769215697.034:843): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff26c5e840 a2=3 a3=0 items=0 ppid=1 pid=5454 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:17.034000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:17.088457 kernel: audit: type=1327 audit(1769215697.034:843): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:17.086345 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 24 00:48:17.096000 audit[5454]: USER_START pid=5454 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.112103 kernel: audit: type=1105 audit(1769215697.096:844): pid=5454 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.112202 kernel: audit: type=1103 audit(1769215697.100:845): pid=5458 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.100000 audit[5458]: CRED_ACQ pid=5458 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.536258 sshd[5458]: Connection closed by 4.153.228.146 port 60800 Jan 24 00:48:17.538343 sshd-session[5454]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:17.558417 kernel: audit: type=1106 audit(1769215697.539:846): pid=5454 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.539000 audit[5454]: USER_END pid=5454 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.545439 systemd-logind[1656]: Session 20 logged out. Waiting for processes to exit. Jan 24 00:48:17.546785 systemd[1]: sshd@18-65.109.167.77:22-4.153.228.146:60800.service: Deactivated successfully. Jan 24 00:48:17.552486 systemd[1]: session-20.scope: Deactivated successfully. Jan 24 00:48:17.558291 systemd-logind[1656]: Removed session 20. Jan 24 00:48:17.539000 audit[5454]: CRED_DISP pid=5454 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:17.543000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-65.109.167.77:22-4.153.228.146:60800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:48:17.573128 kernel: audit: type=1104 audit(1769215697.539:847): pid=5454 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:18.618138 kubelet[2863]: E0124 00:48:18.617866 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:48:18.620524 kubelet[2863]: E0124 00:48:18.620464 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:48:19.618505 kubelet[2863]: E0124 00:48:19.618438 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:48:22.618165 kubelet[2863]: E0124 00:48:22.618094 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:48:22.675262 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:22.675347 kernel: audit: type=1130 
audit(1769215702.667:849): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-65.109.167.77:22-4.153.228.146:60802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:22.667000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-65.109.167.77:22-4.153.228.146:60802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:22.668699 systemd[1]: Started sshd@19-65.109.167.77:22-4.153.228.146:60802.service - OpenSSH per-connection server daemon (4.153.228.146:60802). Jan 24 00:48:23.334000 audit[5471]: USER_ACCT pid=5471 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.340013 sshd-session[5471]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:23.341635 sshd[5471]: Accepted publickey for core from 4.153.228.146 port 60802 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:23.334000 audit[5471]: CRED_ACQ pid=5471 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.354305 kernel: audit: type=1101 audit(1769215703.334:850): pid=5471 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.354397 kernel: audit: type=1103 audit(1769215703.334:851): pid=5471 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.354504 systemd-logind[1656]: New session 21 of user core. Jan 24 00:48:23.366257 kernel: audit: type=1006 audit(1769215703.334:852): pid=5471 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 24 00:48:23.334000 audit[5471]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffecc37f4a0 a2=3 a3=0 items=0 ppid=1 pid=5471 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:23.378007 kernel: audit: type=1300 audit(1769215703.334:852): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffecc37f4a0 a2=3 a3=0 items=0 ppid=1 pid=5471 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:23.378916 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 24 00:48:23.334000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:23.394120 kernel: audit: type=1327 audit(1769215703.334:852): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:23.386000 audit[5471]: USER_START pid=5471 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.395000 audit[5475]: CRED_ACQ pid=5475 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.414595 kernel: audit: type=1105 audit(1769215703.386:853): pid=5471 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.414686 kernel: audit: type=1103 audit(1769215703.395:854): pid=5475 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.790241 sshd[5475]: Connection closed by 4.153.228.146 port 60802 Jan 24 00:48:23.792318 sshd-session[5471]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:23.792000 audit[5471]: USER_END pid=5471 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.801135 kernel: audit: type=1106 audit(1769215703.792:855): pid=5471 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.801494 systemd[1]: sshd@19-65.109.167.77:22-4.153.228.146:60802.service: Deactivated successfully. Jan 24 00:48:23.794000 audit[5471]: CRED_DISP pid=5471 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.803829 systemd[1]: session-21.scope: Deactivated successfully. Jan 24 00:48:23.804767 systemd-logind[1656]: Session 21 logged out. Waiting for processes to exit. Jan 24 00:48:23.807205 systemd-logind[1656]: Removed session 21. 
Jan 24 00:48:23.816483 kernel: audit: type=1104 audit(1769215703.794:856): pid=5471 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:23.800000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-65.109.167.77:22-4.153.228.146:60802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:26.617028 kubelet[2863]: E0124 00:48:26.616956 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:48:28.615617 kubelet[2863]: E0124 00:48:28.615547 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:48:28.930000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-65.109.167.77:22-4.153.228.146:37404 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:28.931150 systemd[1]: Started sshd@20-65.109.167.77:22-4.153.228.146:37404.service - OpenSSH per-connection server daemon (4.153.228.146:37404). Jan 24 00:48:28.933982 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:28.934126 kernel: audit: type=1130 audit(1769215708.930:858): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-65.109.167.77:22-4.153.228.146:37404 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:29.623000 audit[5491]: USER_ACCT pid=5491 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:29.635399 sshd[5491]: Accepted publickey for core from 4.153.228.146 port 37404 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:29.637079 kernel: audit: type=1101 audit(1769215709.623:859): pid=5491 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:29.638154 sshd-session[5491]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:29.646320 systemd-logind[1656]: New session 22 of user core. 
Jan 24 00:48:29.636000 audit[5491]: CRED_ACQ pid=5491 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:29.653081 kernel: audit: type=1103 audit(1769215709.636:860): pid=5491 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:29.658952 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 24 00:48:29.659080 kernel: audit: type=1006 audit(1769215709.636:861): pid=5491 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 24 00:48:29.636000 audit[5491]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffebe23590 a2=3 a3=0 items=0 ppid=1 pid=5491 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:29.671080 kernel: audit: type=1300 audit(1769215709.636:861): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffebe23590 a2=3 a3=0 items=0 ppid=1 pid=5491 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:29.636000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:29.674081 kernel: audit: type=1327 audit(1769215709.636:861): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:29.661000 audit[5491]: USER_START pid=5491 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:29.670000 audit[5497]: CRED_ACQ pid=5497 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:29.682727 kernel: audit: type=1105 audit(1769215709.661:862): pid=5491 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:29.682765 kernel: audit: type=1103 audit(1769215709.670:863): pid=5497 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:30.060387 sshd[5497]: Connection closed by 4.153.228.146 port 37404 Jan 24 00:48:30.060400 sshd-session[5491]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:30.062000 audit[5491]: USER_END pid=5491 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:30.072588 systemd-logind[1656]: Session 22 logged out. Waiting for processes to exit. Jan 24 00:48:30.076947 systemd[1]: sshd@20-65.109.167.77:22-4.153.228.146:37404.service: Deactivated successfully. Jan 24 00:48:30.081135 kernel: audit: type=1106 audit(1769215710.062:864): pid=5491 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:30.094374 kernel: audit: type=1104 audit(1769215710.063:865): pid=5491 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:30.063000 audit[5491]: CRED_DISP pid=5491 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:30.086409 systemd[1]: session-22.scope: Deactivated successfully. Jan 24 00:48:30.092034 systemd-logind[1656]: Removed session 22. Jan 24 00:48:30.077000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-65.109.167.77:22-4.153.228.146:37404 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:48:30.618368 kubelet[2863]: E0124 00:48:30.617449 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:48:31.618328 kubelet[2863]: E0124 00:48:31.617572 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:48:32.616751 kubelet[2863]: E0124 00:48:32.616688 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:48:35.204532 systemd[1]: Started sshd@21-65.109.167.77:22-4.153.228.146:39990.service - OpenSSH per-connection server daemon (4.153.228.146:39990). Jan 24 00:48:35.219359 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:35.219418 kernel: audit: type=1130 audit(1769215715.203:867): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-65.109.167.77:22-4.153.228.146:39990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:35.203000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-65.109.167.77:22-4.153.228.146:39990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:48:35.920000 audit[5533]: USER_ACCT pid=5533 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:35.924118 sshd[5533]: Accepted publickey for core from 4.153.228.146 port 39990 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:35.928793 kernel: audit: type=1101 audit(1769215715.920:868): pid=5533 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:35.926935 sshd-session[5533]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:35.922000 audit[5533]: CRED_ACQ pid=5533 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:35.938111 kernel: audit: type=1103 audit(1769215715.922:869): pid=5533 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:35.945124 kernel: audit: type=1006 audit(1769215715.922:870): pid=5533 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 24 00:48:35.944879 systemd-logind[1656]: New session 23 of user core. Jan 24 00:48:35.922000 audit[5533]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed775a310 a2=3 a3=0 items=0 ppid=1 pid=5533 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:35.949604 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 24 00:48:35.953473 kernel: audit: type=1300 audit(1769215715.922:870): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed775a310 a2=3 a3=0 items=0 ppid=1 pid=5533 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:35.922000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:35.955000 audit[5533]: USER_START pid=5533 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:35.963074 kernel: audit: type=1327 audit(1769215715.922:870): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:35.963114 kernel: audit: type=1105 audit(1769215715.955:871): pid=5533 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:35.959000 audit[5537]: CRED_ACQ pid=5537 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:35.977082 kernel: audit: type=1103 audit(1769215715.959:872): pid=5537 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:36.433359 sshd[5537]: Connection closed by 4.153.228.146 port 39990 Jan 24 00:48:36.434370 sshd-session[5533]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:36.435000 audit[5533]: USER_END pid=5533 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:36.442002 systemd-logind[1656]: Session 23 logged out. Waiting for processes to exit. Jan 24 00:48:36.445917 systemd[1]: sshd@21-65.109.167.77:22-4.153.228.146:39990.service: Deactivated successfully. Jan 24 00:48:36.453551 systemd[1]: session-23.scope: Deactivated successfully. 
Jan 24 00:48:36.435000 audit[5533]: CRED_DISP pid=5533 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:36.456416 kernel: audit: type=1106 audit(1769215716.435:873): pid=5533 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:36.456690 kernel: audit: type=1104 audit(1769215716.435:874): pid=5533 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:36.459940 systemd-logind[1656]: Removed session 23. Jan 24 00:48:36.445000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-65.109.167.77:22-4.153.228.146:39990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:36.616829 kubelet[2863]: E0124 00:48:36.616581 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:48:38.616113 kubelet[2863]: E0124 00:48:38.615786 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:48:41.572996 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:41.576156 kernel: audit: type=1130 audit(1769215721.564:876): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-65.109.167.77:22-4.153.228.146:40000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:41.564000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-65.109.167.77:22-4.153.228.146:40000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:41.565573 systemd[1]: Started sshd@22-65.109.167.77:22-4.153.228.146:40000.service - OpenSSH per-connection server daemon (4.153.228.146:40000). 
Jan 24 00:48:42.233000 audit[5550]: USER_ACCT pid=5550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.239397 sshd-session[5550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:42.240476 sshd[5550]: Accepted publickey for core from 4.153.228.146 port 40000 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:42.251326 kernel: audit: type=1101 audit(1769215722.233:877): pid=5550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.233000 audit[5550]: CRED_ACQ pid=5550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.263571 systemd-logind[1656]: New session 24 of user core. Jan 24 00:48:42.268520 kernel: audit: type=1103 audit(1769215722.233:878): pid=5550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.233000 audit[5550]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff42963420 a2=3 a3=0 items=0 ppid=1 pid=5550 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:42.282352 kernel: audit: type=1006 audit(1769215722.233:879): pid=5550 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 24 00:48:42.282431 kernel: audit: type=1300 audit(1769215722.233:879): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff42963420 a2=3 a3=0 items=0 ppid=1 pid=5550 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:42.283447 systemd[1]: Started session-24.scope - Session 24 of User core. 
Jan 24 00:48:42.233000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:42.294520 kernel: audit: type=1327 audit(1769215722.233:879): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:42.292000 audit[5550]: USER_START pid=5550 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.301708 kernel: audit: type=1105 audit(1769215722.292:880): pid=5550 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.297000 audit[5554]: CRED_ACQ pid=5554 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.313980 kernel: audit: type=1103 audit(1769215722.297:881): pid=5554 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.617200 kubelet[2863]: E0124 00:48:42.617105 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:48:42.717992 sshd[5554]: Connection closed by 4.153.228.146 port 40000 Jan 24 00:48:42.719377 sshd-session[5550]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:42.720000 audit[5550]: USER_END pid=5550 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.724732 systemd[1]: sshd@22-65.109.167.77:22-4.153.228.146:40000.service: Deactivated successfully. Jan 24 00:48:42.727963 systemd[1]: session-24.scope: Deactivated successfully. Jan 24 00:48:42.732745 systemd-logind[1656]: Session 24 logged out. Waiting for processes to exit. Jan 24 00:48:42.733959 systemd-logind[1656]: Removed session 24. 
Jan 24 00:48:42.720000 audit[5550]: CRED_DISP pid=5550 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.747128 kernel: audit: type=1106 audit(1769215722.720:882): pid=5550 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.747270 kernel: audit: type=1104 audit(1769215722.720:883): pid=5550 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:42.724000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-65.109.167.77:22-4.153.228.146:40000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:43.617890 kubelet[2863]: E0124 00:48:43.617748 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:48:43.617890 kubelet[2863]: E0124 00:48:43.617806 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:48:46.617450 containerd[1682]: time="2026-01-24T00:48:46.617264813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 24 00:48:47.051163 containerd[1682]: time="2026-01-24T00:48:47.050869061Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:48:47.053877 containerd[1682]: time="2026-01-24T00:48:47.053821652Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 24 00:48:47.053877 containerd[1682]: time="2026-01-24T00:48:47.053910822Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 24 00:48:47.054166 kubelet[2863]: E0124 00:48:47.054108 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:48:47.054619 kubelet[2863]: E0124 00:48:47.054176 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:48:47.054619 kubelet[2863]: E0124 00:48:47.054329 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c13ad1f9d9cd499a81c814dc308eb491,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 24 00:48:47.056921 containerd[1682]: time="2026-01-24T00:48:47.056860264Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 24 00:48:47.491563 containerd[1682]: time="2026-01-24T00:48:47.491465990Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:48:47.493148 containerd[1682]: time="2026-01-24T00:48:47.493090296Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 24 00:48:47.493387 containerd[1682]: time="2026-01-24T00:48:47.493159186Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 24 00:48:47.493521 kubelet[2863]: E0124 00:48:47.493324 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" 
Jan 24 00:48:47.493521 kubelet[2863]: E0124 00:48:47.493370 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:48:47.493521 kubelet[2863]: E0124 00:48:47.493456 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 24 00:48:47.494826 kubelet[2863]: E0124 00:48:47.494772 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:48:47.861142 systemd[1]: Started sshd@23-65.109.167.77:22-4.153.228.146:55154.service - OpenSSH per-connection server daemon (4.153.228.146:55154). 
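
Every containerd fetch above ends in "404 Not Found" from ghcr.io before kubelet reports ErrImagePull. A hedged sketch for checking a tag directly against the registry, assuming ghcr.io follows the standard registry-v2 anonymous token flow for public images (the repository and tag are taken from the failing whisker pull above; this is a reproduction aid, not something the node runs):

    # Reproduce the 404 that containerd logs when resolving
    # ghcr.io/flatcar/calico/whisker:v3.30.4 (assumption: anonymous v2 token flow).
    import json
    import urllib.error
    import urllib.request

    REPO = "flatcar/calico/whisker"   # repository from the failing pull above
    TAG = "v3.30.4"                   # tag containerd could not resolve

    # 1. Fetch an anonymous pull token for the repository.
    token_url = f"https://ghcr.io/token?service=ghcr.io&scope=repository:{REPO}:pull"
    with urllib.request.urlopen(token_url) as resp:
        token = json.load(resp)["token"]

    # 2. Ask for the tag's manifest; a 404 here matches the
    #    "failed to resolve image ... not found" entries in the journal.
    manifest_url = f"https://ghcr.io/v2/{REPO}/manifests/{TAG}"
    req = urllib.request.Request(
        manifest_url,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.oci.image.index.v1+json, "
                      "application/vnd.docker.distribution.manifest.list.v2+json",
        },
        method="HEAD",
    )
    try:
        with urllib.request.urlopen(req) as resp:
            print("tag exists:", resp.status)
    except urllib.error.HTTPError as err:
        print("registry answered", err.code)   # expect 404, as logged above
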
Jan 24 00:48:47.876983 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:47.877118 kernel: audit: type=1130 audit(1769215727.860:885): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-65.109.167.77:22-4.153.228.146:55154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:47.860000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-65.109.167.77:22-4.153.228.146:55154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:48.532000 audit[5567]: USER_ACCT pid=5567 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:48.535583 sshd-session[5567]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:48.537457 sshd[5567]: Accepted publickey for core from 4.153.228.146 port 55154 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:48.556107 kernel: audit: type=1101 audit(1769215728.532:886): pid=5567 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:48.556184 kernel: audit: type=1103 audit(1769215728.532:887): pid=5567 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:48.532000 audit[5567]: CRED_ACQ pid=5567 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:48.549512 systemd-logind[1656]: New session 25 of user core. Jan 24 00:48:48.561305 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 24 00:48:48.532000 audit[5567]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdb67127e0 a2=3 a3=0 items=0 ppid=1 pid=5567 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:48.579011 kernel: audit: type=1006 audit(1769215728.532:888): pid=5567 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 24 00:48:48.579138 kernel: audit: type=1300 audit(1769215728.532:888): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdb67127e0 a2=3 a3=0 items=0 ppid=1 pid=5567 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:48.532000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:48.567000 audit[5567]: USER_START pid=5567 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:48.584481 kernel: audit: type=1327 audit(1769215728.532:888): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:48.584540 kernel: audit: type=1105 audit(1769215728.567:889): pid=5567 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:48.567000 audit[5571]: CRED_ACQ pid=5571 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:48.592005 kernel: audit: type=1103 audit(1769215728.567:890): pid=5571 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:48.616825 kubelet[2863]: E0124 00:48:48.616571 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:48:49.013539 sshd[5571]: Connection closed by 4.153.228.146 port 55154 Jan 24 00:48:49.015017 sshd-session[5567]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:49.021000 audit[5567]: USER_END pid=5567 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail 
acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:49.035566 systemd[1]: sshd@23-65.109.167.77:22-4.153.228.146:55154.service: Deactivated successfully. Jan 24 00:48:49.040326 kernel: audit: type=1106 audit(1769215729.021:891): pid=5567 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:49.055288 kernel: audit: type=1104 audit(1769215729.021:892): pid=5567 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:49.021000 audit[5567]: CRED_DISP pid=5567 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:49.046804 systemd[1]: session-25.scope: Deactivated successfully. Jan 24 00:48:49.052674 systemd-logind[1656]: Session 25 logged out. Waiting for processes to exit. Jan 24 00:48:49.032000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-65.109.167.77:22-4.153.228.146:55154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:49.059547 systemd-logind[1656]: Removed session 25. Jan 24 00:48:53.621499 kubelet[2863]: E0124 00:48:53.620005 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:48:54.155215 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:48:54.155301 kernel: audit: type=1130 audit(1769215734.149:894): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-65.109.167.77:22-4.153.228.146:55162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:54.149000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-65.109.167.77:22-4.153.228.146:55162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:54.149848 systemd[1]: Started sshd@24-65.109.167.77:22-4.153.228.146:55162.service - OpenSSH per-connection server daemon (4.153.228.146:55162). 
Jan 24 00:48:54.618769 kubelet[2863]: E0124 00:48:54.618730 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:48:54.619948 kubelet[2863]: E0124 00:48:54.619923 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:48:54.824000 audit[5591]: USER_ACCT pid=5591 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:54.829211 sshd-session[5591]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:48:54.831885 sshd[5591]: Accepted publickey for core from 4.153.228.146 port 55162 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:48:54.838844 kernel: audit: type=1101 audit(1769215734.824:895): pid=5591 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:54.838914 kernel: audit: type=1103 audit(1769215734.828:896): pid=5591 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:54.828000 audit[5591]: CRED_ACQ pid=5591 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:54.837925 systemd-logind[1656]: New session 26 of user core. Jan 24 00:48:54.843116 kernel: audit: type=1006 audit(1769215734.828:897): pid=5591 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Jan 24 00:48:54.841985 systemd[1]: Started session-26.scope - Session 26 of User core. 
Jan 24 00:48:54.828000 audit[5591]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc2bb97f50 a2=3 a3=0 items=0 ppid=1 pid=5591 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:54.852611 kernel: audit: type=1300 audit(1769215734.828:897): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc2bb97f50 a2=3 a3=0 items=0 ppid=1 pid=5591 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:48:54.852676 kernel: audit: type=1327 audit(1769215734.828:897): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:54.828000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:48:54.856689 kernel: audit: type=1105 audit(1769215734.848:898): pid=5591 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:54.848000 audit[5591]: USER_START pid=5591 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:54.850000 audit[5595]: CRED_ACQ pid=5595 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:54.861471 kernel: audit: type=1103 audit(1769215734.850:899): pid=5595 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:55.259942 sshd[5595]: Connection closed by 4.153.228.146 port 55162 Jan 24 00:48:55.262367 sshd-session[5591]: pam_unix(sshd:session): session closed for user core Jan 24 00:48:55.266000 audit[5591]: USER_END pid=5591 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:55.275007 systemd[1]: sshd@24-65.109.167.77:22-4.153.228.146:55162.service: Deactivated successfully. Jan 24 00:48:55.282024 systemd[1]: session-26.scope: Deactivated successfully. 
Jan 24 00:48:55.284100 kernel: audit: type=1106 audit(1769215735.266:900): pid=5591 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:55.300198 kernel: audit: type=1104 audit(1769215735.266:901): pid=5591 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:55.266000 audit[5591]: CRED_DISP pid=5591 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:48:55.289056 systemd-logind[1656]: Session 26 logged out. Waiting for processes to exit. Jan 24 00:48:55.293402 systemd-logind[1656]: Removed session 26. Jan 24 00:48:55.276000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-65.109.167.77:22-4.153.228.146:55162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:48:58.617455 containerd[1682]: time="2026-01-24T00:48:58.617040656Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 24 00:48:59.059241 containerd[1682]: time="2026-01-24T00:48:59.058118941Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:48:59.060691 containerd[1682]: time="2026-01-24T00:48:59.060605509Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 24 00:48:59.061287 containerd[1682]: time="2026-01-24T00:48:59.060799949Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 24 00:48:59.062169 kubelet[2863]: E0124 00:48:59.061854 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:48:59.062169 kubelet[2863]: E0124 00:48:59.061913 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:48:59.064145 kubelet[2863]: E0124 00:48:59.062135 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 24 00:48:59.064145 kubelet[2863]: E0124 00:48:59.063953 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:49:00.399568 systemd[1]: Started sshd@25-65.109.167.77:22-4.153.228.146:52322.service - OpenSSH per-connection server daemon 
(4.153.228.146:52322). Jan 24 00:49:00.402200 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:00.402345 kernel: audit: type=1130 audit(1769215740.400:903): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-65.109.167.77:22-4.153.228.146:52322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:00.400000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-65.109.167.77:22-4.153.228.146:52322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:01.100000 audit[5609]: USER_ACCT pid=5609 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.106259 sshd[5609]: Accepted publickey for core from 4.153.228.146 port 52322 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:01.107109 kernel: audit: type=1101 audit(1769215741.100:904): pid=5609 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.108387 sshd-session[5609]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:01.115263 kernel: audit: type=1103 audit(1769215741.107:905): pid=5609 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.107000 audit[5609]: CRED_ACQ pid=5609 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.128179 kernel: audit: type=1006 audit(1769215741.107:906): pid=5609 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Jan 24 00:49:01.128338 kernel: audit: type=1300 audit(1769215741.107:906): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcf01520a0 a2=3 a3=0 items=0 ppid=1 pid=5609 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:01.107000 audit[5609]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcf01520a0 a2=3 a3=0 items=0 ppid=1 pid=5609 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:01.107000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:01.130822 kernel: audit: type=1327 audit(1769215741.107:906): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:01.136944 systemd-logind[1656]: New session 27 of user core. Jan 24 00:49:01.141370 systemd[1]: Started session-27.scope - Session 27 of User core. 
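
For a quicker read of the repeating kubelet failures, a rough sketch (hypothetical helper; the filename kubelet.log is an assumption, e.g. a capture from journalctl -u kubelet on the node) that groups the "Error syncing pod" lines by pod and lists the images each pod cannot pull:

    # Summarize "Error syncing pod" journal lines: pod -> images failing to pull.
    import re
    from collections import defaultdict

    POD_RE = re.compile(r'pod="([^"]+)"')
    IMAGE_RE = re.compile(r'ghcr\.io/flatcar/calico/[\w.-]+:v[\d.]+')

    failures = defaultdict(set)
    with open("kubelet.log") as fh:       # assumed capture of this journal
        for line in fh:
            if "Error syncing pod" not in line:
                continue
            pod = POD_RE.search(line)
            if not pod:
                continue
            failures[pod.group(1)].update(IMAGE_RE.findall(line))

    for pod, images in sorted(failures.items()):
        print(pod)
        for image in sorted(images):
            print("   ", image)
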
Jan 24 00:49:01.146000 audit[5609]: USER_START pid=5609 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.154148 kernel: audit: type=1105 audit(1769215741.146:907): pid=5609 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.149000 audit[5613]: CRED_ACQ pid=5613 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.165082 kernel: audit: type=1103 audit(1769215741.149:908): pid=5613 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.549363 sshd[5613]: Connection closed by 4.153.228.146 port 52322 Jan 24 00:49:01.550178 sshd-session[5609]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:01.551000 audit[5609]: USER_END pid=5609 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.555635 systemd[1]: sshd@25-65.109.167.77:22-4.153.228.146:52322.service: Deactivated successfully. Jan 24 00:49:01.558288 systemd[1]: session-27.scope: Deactivated successfully. Jan 24 00:49:01.560530 kernel: audit: type=1106 audit(1769215741.551:909): pid=5609 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.560849 kernel: audit: type=1104 audit(1769215741.551:910): pid=5609 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.551000 audit[5609]: CRED_DISP pid=5609 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:01.560567 systemd-logind[1656]: Session 27 logged out. Waiting for processes to exit. Jan 24 00:49:01.561924 systemd-logind[1656]: Removed session 27. Jan 24 00:49:01.554000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-65.109.167.77:22-4.153.228.146:52322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:49:01.617229 containerd[1682]: time="2026-01-24T00:49:01.616757686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 24 00:49:01.623226 kubelet[2863]: E0124 00:49:01.623155 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:49:02.072455 containerd[1682]: time="2026-01-24T00:49:02.072189860Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:49:02.074101 containerd[1682]: time="2026-01-24T00:49:02.073989475Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 24 00:49:02.074437 containerd[1682]: time="2026-01-24T00:49:02.074161155Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 24 00:49:02.075704 kubelet[2863]: E0124 00:49:02.075237 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:49:02.075704 kubelet[2863]: E0124 00:49:02.075295 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:49:02.075704 kubelet[2863]: E0124 00:49:02.075538 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5fsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 24 00:49:02.077441 kubelet[2863]: E0124 00:49:02.077303 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:49:05.620846 containerd[1682]: time="2026-01-24T00:49:05.620729656Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:49:06.045162 containerd[1682]: 
time="2026-01-24T00:49:06.045011018Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:49:06.046440 containerd[1682]: time="2026-01-24T00:49:06.046407742Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:49:06.046503 containerd[1682]: time="2026-01-24T00:49:06.046474472Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:49:06.046649 kubelet[2863]: E0124 00:49:06.046618 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:49:06.047961 kubelet[2863]: E0124 00:49:06.047122 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:49:06.047961 kubelet[2863]: E0124 00:49:06.047256 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tm5dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:49:06.048572 kubelet[2863]: E0124 00:49:06.048539 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:49:06.617032 containerd[1682]: time="2026-01-24T00:49:06.616494518Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:49:06.689000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-65.109.167.77:22-4.153.228.146:46088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:06.689555 systemd[1]: Started sshd@26-65.109.167.77:22-4.153.228.146:46088.service - OpenSSH per-connection server daemon (4.153.228.146:46088). Jan 24 00:49:06.691819 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:06.692033 kernel: audit: type=1130 audit(1769215746.689:912): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-65.109.167.77:22-4.153.228.146:46088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:07.049331 containerd[1682]: time="2026-01-24T00:49:07.049157293Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:49:07.051312 containerd[1682]: time="2026-01-24T00:49:07.051222368Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:49:07.051428 containerd[1682]: time="2026-01-24T00:49:07.051369729Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:49:07.051602 kubelet[2863]: E0124 00:49:07.051546 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:49:07.052565 kubelet[2863]: E0124 00:49:07.051612 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:49:07.052565 kubelet[2863]: E0124 00:49:07.051913 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llmb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:49:07.052759 containerd[1682]: time="2026-01-24T00:49:07.051961040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 24 00:49:07.054693 kubelet[2863]: E0124 00:49:07.054193 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:49:07.392000 audit[5652]: USER_ACCT pid=5652 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.393420 sshd[5652]: Accepted publickey for core from 4.153.228.146 port 46088 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:07.395268 sshd-session[5652]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:07.392000 audit[5652]: CRED_ACQ pid=5652 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.400369 kernel: audit: type=1101 audit(1769215747.392:913): pid=5652 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.400428 kernel: audit: type=1103 audit(1769215747.392:914): pid=5652 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.415092 kernel: audit: type=1006 audit(1769215747.392:915): pid=5652 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Jan 24 00:49:07.415182 kernel: audit: type=1300 audit(1769215747.392:915): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee6005050 a2=3 a3=0 items=0 ppid=1 pid=5652 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:07.392000 audit[5652]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee6005050 a2=3 a3=0 items=0 ppid=1 pid=5652 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:07.415025 systemd-logind[1656]: New session 28 of user core. Jan 24 00:49:07.392000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:07.419094 kernel: audit: type=1327 audit(1769215747.392:915): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:07.421237 systemd[1]: Started session-28.scope - Session 28 of User core. 
Jan 24 00:49:07.424000 audit[5652]: USER_START pid=5652 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.432309 kernel: audit: type=1105 audit(1769215747.424:916): pid=5652 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.432000 audit[5656]: CRED_ACQ pid=5656 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.438098 kernel: audit: type=1103 audit(1769215747.432:917): pid=5656 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.495212 containerd[1682]: time="2026-01-24T00:49:07.495153342Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:49:07.496583 containerd[1682]: time="2026-01-24T00:49:07.496545905Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 24 00:49:07.496583 containerd[1682]: time="2026-01-24T00:49:07.496606136Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 24 00:49:07.497177 kubelet[2863]: E0124 00:49:07.496855 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:49:07.497177 kubelet[2863]: E0124 00:49:07.496909 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:49:07.497323 kubelet[2863]: E0124 00:49:07.497034 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 24 00:49:07.500426 containerd[1682]: time="2026-01-24T00:49:07.500349456Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 24 00:49:07.811125 sshd[5656]: Connection closed by 4.153.228.146 port 46088 Jan 24 00:49:07.812087 sshd-session[5652]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:07.823083 kernel: audit: type=1106 audit(1769215747.815:918): pid=5652 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.815000 audit[5652]: USER_END pid=5652 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.818783 systemd[1]: sshd@26-65.109.167.77:22-4.153.228.146:46088.service: Deactivated successfully. Jan 24 00:49:07.823158 systemd[1]: session-28.scope: Deactivated successfully. 
Jan 24 00:49:07.815000 audit[5652]: CRED_DISP pid=5652 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.829724 systemd-logind[1656]: Session 28 logged out. Waiting for processes to exit. Jan 24 00:49:07.830979 kernel: audit: type=1104 audit(1769215747.815:919): pid=5652 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:07.831286 systemd-logind[1656]: Removed session 28. Jan 24 00:49:07.817000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-65.109.167.77:22-4.153.228.146:46088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:07.932353 containerd[1682]: time="2026-01-24T00:49:07.932296768Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:49:07.934255 containerd[1682]: time="2026-01-24T00:49:07.934117943Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 24 00:49:07.934255 containerd[1682]: time="2026-01-24T00:49:07.934186334Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 24 00:49:07.934608 kubelet[2863]: E0124 00:49:07.934547 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:49:07.934713 kubelet[2863]: E0124 00:49:07.934613 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:49:07.935081 kubelet[2863]: E0124 00:49:07.934868 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 24 00:49:07.936213 kubelet[2863]: E0124 00:49:07.936159 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:49:12.620104 kubelet[2863]: E0124 00:49:12.619887 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:49:12.961959 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:12.962147 kernel: audit: type=1130 audit(1769215752.952:921): pid=1 uid=0 
auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-65.109.167.77:22-4.153.228.146:46096 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:12.952000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-65.109.167.77:22-4.153.228.146:46096 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:12.952803 systemd[1]: Started sshd@27-65.109.167.77:22-4.153.228.146:46096.service - OpenSSH per-connection server daemon (4.153.228.146:46096). Jan 24 00:49:13.622221 kubelet[2863]: E0124 00:49:13.622156 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:49:13.646000 audit[5669]: USER_ACCT pid=5669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:13.658117 kernel: audit: type=1101 audit(1769215753.646:922): pid=5669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:13.660167 sshd[5669]: Accepted publickey for core from 4.153.228.146 port 46096 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:13.661798 sshd-session[5669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:13.659000 audit[5669]: CRED_ACQ pid=5669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:13.675024 kernel: audit: type=1103 audit(1769215753.659:923): pid=5669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:13.675120 kernel: audit: type=1006 audit(1769215753.660:924): pid=5669 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1 Jan 24 00:49:13.660000 audit[5669]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcdd126440 a2=3 a3=0 items=0 ppid=1 pid=5669 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:13.660000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:13.687401 kernel: audit: type=1300 audit(1769215753.660:924): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcdd126440 a2=3 a3=0 items=0 ppid=1 pid=5669 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:13.687463 kernel: audit: type=1327 audit(1769215753.660:924): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:13.694077 systemd-logind[1656]: New session 29 of user core. Jan 24 00:49:13.698278 systemd[1]: Started session-29.scope - Session 29 of User core. Jan 24 00:49:13.702000 audit[5669]: USER_START pid=5669 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:13.713000 audit[5673]: CRED_ACQ pid=5673 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:13.718856 kernel: audit: type=1105 audit(1769215753.702:925): pid=5669 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:13.718908 kernel: audit: type=1103 audit(1769215753.713:926): pid=5673 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:14.077733 sshd[5673]: Connection closed by 4.153.228.146 port 46096 Jan 24 00:49:14.078297 sshd-session[5669]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:14.080000 audit[5669]: USER_END pid=5669 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:14.082875 systemd-logind[1656]: Session 29 logged out. Waiting for processes to exit. Jan 24 00:49:14.084811 systemd[1]: sshd@27-65.109.167.77:22-4.153.228.146:46096.service: Deactivated successfully. Jan 24 00:49:14.087192 systemd[1]: session-29.scope: Deactivated successfully. Jan 24 00:49:14.089077 systemd-logind[1656]: Removed session 29. 
Jan 24 00:49:14.080000 audit[5669]: CRED_DISP pid=5669 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:14.101556 kernel: audit: type=1106 audit(1769215754.080:927): pid=5669 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:14.101613 kernel: audit: type=1104 audit(1769215754.080:928): pid=5669 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:14.085000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-65.109.167.77:22-4.153.228.146:46096 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:16.616463 kubelet[2863]: E0124 00:49:16.616187 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:49:17.619413 kubelet[2863]: E0124 00:49:17.619217 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:49:19.210466 systemd[1]: Started sshd@28-65.109.167.77:22-4.153.228.146:48454.service - OpenSSH per-connection server daemon (4.153.228.146:48454). Jan 24 00:49:19.210000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-65.109.167.77:22-4.153.228.146:48454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:19.212694 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:19.212828 kernel: audit: type=1130 audit(1769215759.210:930): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-65.109.167.77:22-4.153.228.146:48454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:49:19.889000 audit[5686]: USER_ACCT pid=5686 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:19.895228 sshd-session[5686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:19.901580 sshd[5686]: Accepted publickey for core from 4.153.228.146 port 48454 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:19.892000 audit[5686]: CRED_ACQ pid=5686 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:19.909018 kernel: audit: type=1101 audit(1769215759.889:931): pid=5686 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:19.909637 kernel: audit: type=1103 audit(1769215759.892:932): pid=5686 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:19.916875 systemd-logind[1656]: New session 30 of user core. Jan 24 00:49:19.923003 kernel: audit: type=1006 audit(1769215759.892:933): pid=5686 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=30 res=1 Jan 24 00:49:19.892000 audit[5686]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc234613f0 a2=3 a3=0 items=0 ppid=1 pid=5686 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:19.929272 kernel: audit: type=1300 audit(1769215759.892:933): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc234613f0 a2=3 a3=0 items=0 ppid=1 pid=5686 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:19.892000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:19.938644 kernel: audit: type=1327 audit(1769215759.892:933): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:19.940178 systemd[1]: Started session-30.scope - Session 30 of User core. 
Jan 24 00:49:19.952000 audit[5686]: USER_START pid=5686 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:19.960109 kernel: audit: type=1105 audit(1769215759.952:934): pid=5686 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:19.965000 audit[5692]: CRED_ACQ pid=5692 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:19.972132 kernel: audit: type=1103 audit(1769215759.965:935): pid=5692 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:20.332792 sshd[5692]: Connection closed by 4.153.228.146 port 48454 Jan 24 00:49:20.333314 sshd-session[5686]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:20.333000 audit[5686]: USER_END pid=5686 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:20.338234 systemd[1]: sshd@28-65.109.167.77:22-4.153.228.146:48454.service: Deactivated successfully. Jan 24 00:49:20.342261 kernel: audit: type=1106 audit(1769215760.333:936): pid=5686 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:20.342583 systemd[1]: session-30.scope: Deactivated successfully. Jan 24 00:49:20.343614 systemd-logind[1656]: Session 30 logged out. Waiting for processes to exit. Jan 24 00:49:20.334000 audit[5686]: CRED_DISP pid=5686 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:20.347622 systemd-logind[1656]: Removed session 30. Jan 24 00:49:20.353113 kernel: audit: type=1104 audit(1769215760.334:937): pid=5686 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:20.336000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-65.109.167.77:22-4.153.228.146:48454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:49:21.619483 kubelet[2863]: E0124 00:49:21.618862 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:49:21.620669 kubelet[2863]: E0124 00:49:21.620214 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:49:23.618877 kubelet[2863]: E0124 00:49:23.618839 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:49:25.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-65.109.167.77:22-4.153.228.146:45242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:25.464939 systemd[1]: Started sshd@29-65.109.167.77:22-4.153.228.146:45242.service - OpenSSH per-connection server daemon (4.153.228.146:45242). Jan 24 00:49:25.466076 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:25.466129 kernel: audit: type=1130 audit(1769215765.463:939): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-65.109.167.77:22-4.153.228.146:45242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:49:25.622365 kubelet[2863]: E0124 00:49:25.622054 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:49:26.126000 audit[5721]: USER_ACCT pid=5721 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.134777 sshd-session[5721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:26.136910 sshd[5721]: Accepted publickey for core from 4.153.228.146 port 45242 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:26.131000 audit[5721]: CRED_ACQ pid=5721 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.146758 kernel: audit: type=1101 audit(1769215766.126:940): pid=5721 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.146849 kernel: audit: type=1103 audit(1769215766.131:941): pid=5721 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.151395 systemd-logind[1656]: New session 31 of user core. Jan 24 00:49:26.160596 kernel: audit: type=1006 audit(1769215766.131:942): pid=5721 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=31 res=1 Jan 24 00:49:26.131000 audit[5721]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3a937050 a2=3 a3=0 items=0 ppid=1 pid=5721 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:26.171727 kernel: audit: type=1300 audit(1769215766.131:942): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3a937050 a2=3 a3=0 items=0 ppid=1 pid=5721 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:26.172379 systemd[1]: Started session-31.scope - Session 31 of User core. 
Jan 24 00:49:26.183791 kernel: audit: type=1327 audit(1769215766.131:942): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:26.131000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:26.190000 audit[5721]: USER_START pid=5721 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.206109 kernel: audit: type=1105 audit(1769215766.190:943): pid=5721 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.207000 audit[5726]: CRED_ACQ pid=5726 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.218113 kernel: audit: type=1103 audit(1769215766.207:944): pid=5726 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.623301 sshd[5726]: Connection closed by 4.153.228.146 port 45242 Jan 24 00:49:26.622614 sshd-session[5721]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:26.635715 kernel: audit: type=1106 audit(1769215766.626:945): pid=5721 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.626000 audit[5721]: USER_END pid=5721 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.631678 systemd[1]: sshd@29-65.109.167.77:22-4.153.228.146:45242.service: Deactivated successfully. Jan 24 00:49:26.636495 systemd[1]: session-31.scope: Deactivated successfully. Jan 24 00:49:26.640936 systemd-logind[1656]: Session 31 logged out. Waiting for processes to exit. Jan 24 00:49:26.626000 audit[5721]: CRED_DISP pid=5721 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:26.648412 systemd-logind[1656]: Removed session 31. Jan 24 00:49:26.629000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-65.109.167.77:22-4.153.228.146:45242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:49:26.649151 kernel: audit: type=1104 audit(1769215766.626:946): pid=5721 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:29.625506 kubelet[2863]: E0124 00:49:29.625432 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:49:31.770388 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:31.770515 kernel: audit: type=1130 audit(1769215771.763:948): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-65.109.167.77:22-4.153.228.146:45248 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:31.763000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-65.109.167.77:22-4.153.228.146:45248 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:31.764142 systemd[1]: Started sshd@30-65.109.167.77:22-4.153.228.146:45248.service - OpenSSH per-connection server daemon (4.153.228.146:45248). Jan 24 00:49:32.463000 audit[5750]: USER_ACCT pid=5750 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.465103 sshd[5750]: Accepted publickey for core from 4.153.228.146 port 45248 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:32.469720 sshd-session[5750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:32.479549 kernel: audit: type=1101 audit(1769215772.463:949): pid=5750 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.467000 audit[5750]: CRED_ACQ pid=5750 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.495109 kernel: audit: type=1103 audit(1769215772.467:950): pid=5750 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.467000 audit[5750]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffce66c250 a2=3 a3=0 items=0 ppid=1 pid=5750 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:32.505672 systemd-logind[1656]: New session 32 of user core. Jan 24 00:49:32.513750 kernel: audit: type=1006 audit(1769215772.467:951): pid=5750 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=32 res=1 Jan 24 00:49:32.513844 kernel: audit: type=1300 audit(1769215772.467:951): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffce66c250 a2=3 a3=0 items=0 ppid=1 pid=5750 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:32.467000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:32.526399 systemd[1]: Started session-32.scope - Session 32 of User core. Jan 24 00:49:32.531691 kernel: audit: type=1327 audit(1769215772.467:951): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:32.536000 audit[5750]: USER_START pid=5750 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.540000 audit[5754]: CRED_ACQ pid=5754 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.553471 kernel: audit: type=1105 audit(1769215772.536:952): pid=5750 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.553559 kernel: audit: type=1103 audit(1769215772.540:953): pid=5754 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.616711 kubelet[2863]: E0124 00:49:32.616592 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:49:32.617846 kubelet[2863]: E0124 00:49:32.617285 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" 
podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:49:32.978799 sshd[5754]: Connection closed by 4.153.228.146 port 45248 Jan 24 00:49:32.980112 sshd-session[5750]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:32.980000 audit[5750]: USER_END pid=5750 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.984003 systemd-logind[1656]: Session 32 logged out. Waiting for processes to exit. Jan 24 00:49:32.985878 systemd[1]: sshd@30-65.109.167.77:22-4.153.228.146:45248.service: Deactivated successfully. Jan 24 00:49:32.989152 kernel: audit: type=1106 audit(1769215772.980:954): pid=5750 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.990603 systemd[1]: session-32.scope: Deactivated successfully. Jan 24 00:49:32.980000 audit[5750]: CRED_DISP pid=5750 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:32.992664 systemd-logind[1656]: Removed session 32. Jan 24 00:49:32.983000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-65.109.167.77:22-4.153.228.146:45248 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:49:32.999543 kernel: audit: type=1104 audit(1769215772.980:955): pid=5750 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:35.619672 kubelet[2863]: E0124 00:49:35.619550 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:49:37.618974 kubelet[2863]: E0124 00:49:37.618310 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:49:38.114189 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:38.114276 kernel: audit: type=1130 audit(1769215778.106:957): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-65.109.167.77:22-4.153.228.146:42804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:38.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-65.109.167.77:22-4.153.228.146:42804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:38.107362 systemd[1]: Started sshd@31-65.109.167.77:22-4.153.228.146:42804.service - OpenSSH per-connection server daemon (4.153.228.146:42804). 
Jan 24 00:49:38.766000 audit[5791]: USER_ACCT pid=5791 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:38.768652 sshd[5791]: Accepted publickey for core from 4.153.228.146 port 42804 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:38.777249 sshd-session[5791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:38.784105 kernel: audit: type=1101 audit(1769215778.766:958): pid=5791 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:38.772000 audit[5791]: CRED_ACQ pid=5791 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:38.796009 systemd-logind[1656]: New session 33 of user core. Jan 24 00:49:38.803752 kernel: audit: type=1103 audit(1769215778.772:959): pid=5791 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:38.803857 kernel: audit: type=1006 audit(1769215778.773:960): pid=5791 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=33 res=1 Jan 24 00:49:38.773000 audit[5791]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff43efb780 a2=3 a3=0 items=0 ppid=1 pid=5791 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:38.826145 kernel: audit: type=1300 audit(1769215778.773:960): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff43efb780 a2=3 a3=0 items=0 ppid=1 pid=5791 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:38.811524 systemd[1]: Started session-33.scope - Session 33 of User core. 
Jan 24 00:49:38.773000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:38.842002 kernel: audit: type=1327 audit(1769215778.773:960): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:38.842205 kernel: audit: type=1105 audit(1769215778.830:961): pid=5791 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:38.830000 audit[5791]: USER_START pid=5791 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:38.845000 audit[5795]: CRED_ACQ pid=5795 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:38.852097 kernel: audit: type=1103 audit(1769215778.845:962): pid=5795 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:39.262359 sshd[5795]: Connection closed by 4.153.228.146 port 42804 Jan 24 00:49:39.265226 sshd-session[5791]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:39.265000 audit[5791]: USER_END pid=5791 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:39.270022 systemd[1]: sshd@31-65.109.167.77:22-4.153.228.146:42804.service: Deactivated successfully. Jan 24 00:49:39.274106 kernel: audit: type=1106 audit(1769215779.265:963): pid=5791 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:39.273276 systemd[1]: session-33.scope: Deactivated successfully. Jan 24 00:49:39.274203 systemd-logind[1656]: Session 33 logged out. Waiting for processes to exit. Jan 24 00:49:39.277386 systemd-logind[1656]: Removed session 33. 
Jan 24 00:49:39.265000 audit[5791]: CRED_DISP pid=5791 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:39.284103 kernel: audit: type=1104 audit(1769215779.265:964): pid=5791 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:39.269000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-65.109.167.77:22-4.153.228.146:42804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:40.618982 kubelet[2863]: E0124 00:49:40.618887 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:49:43.617759 kubelet[2863]: E0124 00:49:43.617677 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:49:44.401080 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:44.401184 kernel: audit: type=1130 audit(1769215784.395:966): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-65.109.167.77:22-4.153.228.146:42806 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:44.395000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-65.109.167.77:22-4.153.228.146:42806 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:44.396924 systemd[1]: Started sshd@32-65.109.167.77:22-4.153.228.146:42806.service - OpenSSH per-connection server daemon (4.153.228.146:42806). 
Jan 24 00:49:45.057410 sshd[5808]: Accepted publickey for core from 4.153.228.146 port 42806 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:45.056000 audit[5808]: USER_ACCT pid=5808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.065092 kernel: audit: type=1101 audit(1769215785.056:967): pid=5808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.065623 sshd-session[5808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:45.064000 audit[5808]: CRED_ACQ pid=5808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.074084 kernel: audit: type=1103 audit(1769215785.064:968): pid=5808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.064000 audit[5808]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff7be4c090 a2=3 a3=0 items=0 ppid=1 pid=5808 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:45.080495 kernel: audit: type=1006 audit(1769215785.064:969): pid=5808 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=34 res=1 Jan 24 00:49:45.080541 kernel: audit: type=1300 audit(1769215785.064:969): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff7be4c090 a2=3 a3=0 items=0 ppid=1 pid=5808 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:45.079224 systemd-logind[1656]: New session 34 of user core. Jan 24 00:49:45.064000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:45.085542 kernel: audit: type=1327 audit(1769215785.064:969): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:45.087526 systemd[1]: Started session-34.scope - Session 34 of User core. 
Jan 24 00:49:45.091000 audit[5808]: USER_START pid=5808 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.100920 kernel: audit: type=1105 audit(1769215785.091:970): pid=5808 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.100966 kernel: audit: type=1103 audit(1769215785.094:971): pid=5812 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.094000 audit[5812]: CRED_ACQ pid=5812 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.556103 sshd[5812]: Connection closed by 4.153.228.146 port 42806 Jan 24 00:49:45.556676 sshd-session[5808]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:45.577456 kernel: audit: type=1106 audit(1769215785.557:972): pid=5808 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.557000 audit[5808]: USER_END pid=5808 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.565580 systemd[1]: sshd@32-65.109.167.77:22-4.153.228.146:42806.service: Deactivated successfully. Jan 24 00:49:45.571179 systemd[1]: session-34.scope: Deactivated successfully. Jan 24 00:49:45.593168 kernel: audit: type=1104 audit(1769215785.558:973): pid=5808 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.558000 audit[5808]: CRED_DISP pid=5808 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:45.563000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-65.109.167.77:22-4.153.228.146:42806 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:45.592505 systemd-logind[1656]: Session 34 logged out. Waiting for processes to exit. Jan 24 00:49:45.596160 systemd-logind[1656]: Removed session 34. 
Jan 24 00:49:45.619674 kubelet[2863]: E0124 00:49:45.618707 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:49:47.620858 kubelet[2863]: E0124 00:49:47.620739 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:49:48.617473 kubelet[2863]: E0124 00:49:48.617384 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:49:49.618389 kubelet[2863]: E0124 00:49:49.617756 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:49:50.696846 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:50.696959 kernel: audit: type=1130 audit(1769215790.688:975): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-65.109.167.77:22-4.153.228.146:49484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:50.688000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-65.109.167.77:22-4.153.228.146:49484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:50.689166 systemd[1]: Started sshd@33-65.109.167.77:22-4.153.228.146:49484.service - OpenSSH per-connection server daemon (4.153.228.146:49484). 
Jan 24 00:49:51.344000 audit[5826]: USER_ACCT pid=5826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.350769 sshd-session[5826]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:51.351954 sshd[5826]: Accepted publickey for core from 4.153.228.146 port 49484 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:51.344000 audit[5826]: CRED_ACQ pid=5826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.363469 kernel: audit: type=1101 audit(1769215791.344:976): pid=5826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.363548 kernel: audit: type=1103 audit(1769215791.344:977): pid=5826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.377085 kernel: audit: type=1006 audit(1769215791.344:978): pid=5826 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=35 res=1 Jan 24 00:49:51.344000 audit[5826]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd89a18fe0 a2=3 a3=0 items=0 ppid=1 pid=5826 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:51.392153 kernel: audit: type=1300 audit(1769215791.344:978): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd89a18fe0 a2=3 a3=0 items=0 ppid=1 pid=5826 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:51.391345 systemd-logind[1656]: New session 35 of user core. Jan 24 00:49:51.399058 kernel: audit: type=1327 audit(1769215791.344:978): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:51.344000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:51.403805 systemd[1]: Started session-35.scope - Session 35 of User core. 
Jan 24 00:49:51.411000 audit[5826]: USER_START pid=5826 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.426108 kernel: audit: type=1105 audit(1769215791.411:979): pid=5826 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.417000 audit[5834]: CRED_ACQ pid=5834 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.438117 kernel: audit: type=1103 audit(1769215791.417:980): pid=5834 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.837323 sshd[5834]: Connection closed by 4.153.228.146 port 49484 Jan 24 00:49:51.839668 sshd-session[5826]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:51.843000 audit[5826]: USER_END pid=5826 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.858151 kernel: audit: type=1106 audit(1769215791.843:981): pid=5826 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.860096 systemd[1]: sshd@33-65.109.167.77:22-4.153.228.146:49484.service: Deactivated successfully. Jan 24 00:49:51.861833 systemd[1]: session-35.scope: Deactivated successfully. Jan 24 00:49:51.857000 audit[5826]: CRED_DISP pid=5826 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.865398 systemd-logind[1656]: Session 35 logged out. Waiting for processes to exit. Jan 24 00:49:51.871122 kernel: audit: type=1104 audit(1769215791.857:982): pid=5826 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:51.859000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-65.109.167.77:22-4.153.228.146:49484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:51.871770 systemd-logind[1656]: Removed session 35. 
Jan 24 00:49:54.618127 kubelet[2863]: E0124 00:49:54.617901 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:49:55.618025 kubelet[2863]: E0124 00:49:55.617979 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:49:56.615965 kubelet[2863]: E0124 00:49:56.615891 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:49:56.983246 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:49:56.983339 kernel: audit: type=1130 audit(1769215796.973:984): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-65.109.167.77:22-4.153.228.146:39696 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:56.973000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-65.109.167.77:22-4.153.228.146:39696 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:49:56.974309 systemd[1]: Started sshd@34-65.109.167.77:22-4.153.228.146:39696.service - OpenSSH per-connection server daemon (4.153.228.146:39696). 
Jan 24 00:49:57.641000 audit[5848]: USER_ACCT pid=5848 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:57.642877 sshd[5848]: Accepted publickey for core from 4.153.228.146 port 39696 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:49:57.646926 sshd-session[5848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:49:57.649432 kernel: audit: type=1101 audit(1769215797.641:985): pid=5848 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:57.649486 kernel: audit: type=1103 audit(1769215797.644:986): pid=5848 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:57.644000 audit[5848]: CRED_ACQ pid=5848 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:57.657586 kernel: audit: type=1006 audit(1769215797.644:987): pid=5848 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=36 res=1 Jan 24 00:49:57.655236 systemd-logind[1656]: New session 36 of user core. Jan 24 00:49:57.644000 audit[5848]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea5f5b4a0 a2=3 a3=0 items=0 ppid=1 pid=5848 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:57.659480 kernel: audit: type=1300 audit(1769215797.644:987): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea5f5b4a0 a2=3 a3=0 items=0 ppid=1 pid=5848 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:49:57.644000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:57.664631 kernel: audit: type=1327 audit(1769215797.644:987): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:49:57.665262 systemd[1]: Started session-36.scope - Session 36 of User core. 
Jan 24 00:49:57.668000 audit[5848]: USER_START pid=5848 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:57.671000 audit[5852]: CRED_ACQ pid=5852 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:57.677770 kernel: audit: type=1105 audit(1769215797.668:988): pid=5848 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:57.677839 kernel: audit: type=1103 audit(1769215797.671:989): pid=5852 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:58.140752 sshd[5852]: Connection closed by 4.153.228.146 port 39696 Jan 24 00:49:58.141698 sshd-session[5848]: pam_unix(sshd:session): session closed for user core Jan 24 00:49:58.142000 audit[5848]: USER_END pid=5848 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:58.150190 systemd-logind[1656]: Session 36 logged out. Waiting for processes to exit. Jan 24 00:49:58.152304 systemd[1]: sshd@34-65.109.167.77:22-4.153.228.146:39696.service: Deactivated successfully. Jan 24 00:49:58.155150 systemd[1]: session-36.scope: Deactivated successfully. Jan 24 00:49:58.157307 systemd-logind[1656]: Removed session 36. Jan 24 00:49:58.142000 audit[5848]: CRED_DISP pid=5848 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:58.164013 kernel: audit: type=1106 audit(1769215798.142:990): pid=5848 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:58.164237 kernel: audit: type=1104 audit(1769215798.142:991): pid=5848 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:49:58.151000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-65.109.167.77:22-4.153.228.146:39696 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:49:59.618364 kubelet[2863]: E0124 00:49:59.617641 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:50:00.616544 kubelet[2863]: E0124 00:50:00.616467 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:50:03.275782 systemd[1]: Started sshd@35-65.109.167.77:22-4.153.228.146:39710.service - OpenSSH per-connection server daemon (4.153.228.146:39710). Jan 24 00:50:03.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-65.109.167.77:22-4.153.228.146:39710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:03.284485 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:50:03.284585 kernel: audit: type=1130 audit(1769215803.276:993): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-65.109.167.77:22-4.153.228.146:39710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:03.944000 audit[5891]: USER_ACCT pid=5891 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:03.947851 sshd-session[5891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:03.949478 sshd[5891]: Accepted publickey for core from 4.153.228.146 port 39710 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:03.958190 kernel: audit: type=1101 audit(1769215803.944:994): pid=5891 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:03.944000 audit[5891]: CRED_ACQ pid=5891 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:03.967666 systemd-logind[1656]: New session 37 of user core. 
Jan 24 00:50:03.970946 kernel: audit: type=1103 audit(1769215803.944:995): pid=5891 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:03.970996 kernel: audit: type=1006 audit(1769215803.944:996): pid=5891 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=37 res=1 Jan 24 00:50:03.944000 audit[5891]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffed8c8570 a2=3 a3=0 items=0 ppid=1 pid=5891 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:03.979013 kernel: audit: type=1300 audit(1769215803.944:996): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffed8c8570 a2=3 a3=0 items=0 ppid=1 pid=5891 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:03.944000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:03.989325 kernel: audit: type=1327 audit(1769215803.944:996): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:03.990226 systemd[1]: Started session-37.scope - Session 37 of User core. Jan 24 00:50:03.993000 audit[5891]: USER_START pid=5891 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:03.993000 audit[5895]: CRED_ACQ pid=5895 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:04.009958 kernel: audit: type=1105 audit(1769215803.993:997): pid=5891 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:04.010003 kernel: audit: type=1103 audit(1769215803.993:998): pid=5895 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:04.421409 sshd[5895]: Connection closed by 4.153.228.146 port 39710 Jan 24 00:50:04.423014 sshd-session[5891]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:04.449546 kernel: audit: type=1106 audit(1769215804.427:999): pid=5891 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:04.427000 audit[5891]: USER_END pid=5891 uid=0 auid=500 
ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:04.434605 systemd[1]: sshd@35-65.109.167.77:22-4.153.228.146:39710.service: Deactivated successfully. Jan 24 00:50:04.436545 systemd-logind[1656]: Session 37 logged out. Waiting for processes to exit. Jan 24 00:50:04.440401 systemd[1]: session-37.scope: Deactivated successfully. Jan 24 00:50:04.447140 systemd-logind[1656]: Removed session 37. Jan 24 00:50:04.428000 audit[5891]: CRED_DISP pid=5891 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:04.434000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-65.109.167.77:22-4.153.228.146:39710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:04.467142 kernel: audit: type=1104 audit(1769215804.428:1000): pid=5891 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:04.618011 kubelet[2863]: E0124 00:50:04.617546 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:50:06.615363 kubelet[2863]: E0124 00:50:06.615324 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:50:09.556313 systemd[1]: Started sshd@36-65.109.167.77:22-4.153.228.146:35616.service - OpenSSH per-connection server daemon (4.153.228.146:35616). 
Jan 24 00:50:09.560505 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:50:09.560534 kernel: audit: type=1130 audit(1769215809.555:1002): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-65.109.167.77:22-4.153.228.146:35616 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:09.555000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-65.109.167.77:22-4.153.228.146:35616 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:09.617823 kubelet[2863]: E0124 00:50:09.617784 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:50:10.207000 audit[5907]: USER_ACCT pid=5907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.210906 sshd-session[5907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:10.211518 sshd[5907]: Accepted publickey for core from 4.153.228.146 port 35616 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:10.215784 kernel: audit: type=1101 audit(1769215810.207:1003): pid=5907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.215846 kernel: audit: type=1103 audit(1769215810.207:1004): pid=5907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.207000 audit[5907]: CRED_ACQ pid=5907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.217635 systemd-logind[1656]: New session 38 of user core. 
Jan 24 00:50:10.221486 kernel: audit: type=1006 audit(1769215810.207:1005): pid=5907 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=38 res=1 Jan 24 00:50:10.207000 audit[5907]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcf3b89820 a2=3 a3=0 items=0 ppid=1 pid=5907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:10.224182 kernel: audit: type=1300 audit(1769215810.207:1005): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcf3b89820 a2=3 a3=0 items=0 ppid=1 pid=5907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:10.207000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:10.232513 systemd[1]: Started session-38.scope - Session 38 of User core. Jan 24 00:50:10.234071 kernel: audit: type=1327 audit(1769215810.207:1005): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:10.236000 audit[5907]: USER_START pid=5907 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.238000 audit[5911]: CRED_ACQ pid=5911 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.245444 kernel: audit: type=1105 audit(1769215810.236:1006): pid=5907 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.245548 kernel: audit: type=1103 audit(1769215810.238:1007): pid=5911 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.617052 kubelet[2863]: E0124 00:50:10.616977 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:50:10.617684 kubelet[2863]: E0124 00:50:10.617633 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:50:10.646841 sshd[5911]: Connection closed by 4.153.228.146 port 35616 Jan 24 00:50:10.647518 sshd-session[5907]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:10.668255 kernel: audit: type=1106 audit(1769215810.649:1008): pid=5907 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.649000 audit[5907]: USER_END pid=5907 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.656882 systemd-logind[1656]: Session 38 logged out. Waiting for processes to exit. Jan 24 00:50:10.658758 systemd[1]: sshd@36-65.109.167.77:22-4.153.228.146:35616.service: Deactivated successfully. Jan 24 00:50:10.649000 audit[5907]: CRED_DISP pid=5907 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:10.665324 systemd[1]: session-38.scope: Deactivated successfully. Jan 24 00:50:10.675712 systemd-logind[1656]: Removed session 38. Jan 24 00:50:10.654000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-65.109.167.77:22-4.153.228.146:35616 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:50:10.682125 kernel: audit: type=1104 audit(1769215810.649:1009): pid=5907 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:13.619889 kubelet[2863]: E0124 00:50:13.619797 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:50:15.619189 kubelet[2863]: E0124 00:50:15.619127 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:50:15.790263 systemd[1]: Started sshd@37-65.109.167.77:22-4.153.228.146:38608.service - OpenSSH per-connection server daemon (4.153.228.146:38608). Jan 24 00:50:15.789000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-65.109.167.77:22-4.153.228.146:38608 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:15.792761 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:50:15.792853 kernel: audit: type=1130 audit(1769215815.789:1011): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-65.109.167.77:22-4.153.228.146:38608 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:50:16.491000 audit[5925]: USER_ACCT pid=5925 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.494181 sshd[5925]: Accepted publickey for core from 4.153.228.146 port 38608 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:16.498322 sshd-session[5925]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:16.500151 kernel: audit: type=1101 audit(1769215816.491:1012): pid=5925 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.494000 audit[5925]: CRED_ACQ pid=5925 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.509557 systemd-logind[1656]: New session 39 of user core. Jan 24 00:50:16.511443 kernel: audit: type=1103 audit(1769215816.494:1013): pid=5925 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.523669 kernel: audit: type=1006 audit(1769215816.494:1014): pid=5925 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=39 res=1 Jan 24 00:50:16.523783 kernel: audit: type=1300 audit(1769215816.494:1014): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd10147d00 a2=3 a3=0 items=0 ppid=1 pid=5925 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:16.494000 audit[5925]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd10147d00 a2=3 a3=0 items=0 ppid=1 pid=5925 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:16.518980 systemd[1]: Started session-39.scope - Session 39 of User core. 
Jan 24 00:50:16.494000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:16.527085 kernel: audit: type=1327 audit(1769215816.494:1014): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:16.532000 audit[5925]: USER_START pid=5925 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.541104 kernel: audit: type=1105 audit(1769215816.532:1015): pid=5925 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.536000 audit[5929]: CRED_ACQ pid=5929 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.547095 kernel: audit: type=1103 audit(1769215816.536:1016): pid=5929 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.965609 sshd[5929]: Connection closed by 4.153.228.146 port 38608 Jan 24 00:50:16.967360 sshd-session[5925]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:16.967000 audit[5925]: USER_END pid=5925 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.976573 kernel: audit: type=1106 audit(1769215816.967:1017): pid=5925 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.978206 systemd[1]: sshd@37-65.109.167.77:22-4.153.228.146:38608.service: Deactivated successfully. Jan 24 00:50:16.967000 audit[5925]: CRED_DISP pid=5925 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.984023 systemd[1]: session-39.scope: Deactivated successfully. 
Jan 24 00:50:16.987304 kernel: audit: type=1104 audit(1769215816.967:1018): pid=5925 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:16.977000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-65.109.167.77:22-4.153.228.146:38608 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:16.987396 systemd-logind[1656]: Session 39 logged out. Waiting for processes to exit. Jan 24 00:50:16.988989 systemd-logind[1656]: Removed session 39. Jan 24 00:50:18.617399 kubelet[2863]: E0124 00:50:18.617234 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:50:19.043502 containerd[1682]: time="2026-01-24T00:50:19.043140774Z" level=info msg="container event discarded" container=ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059 type=CONTAINER_CREATED_EVENT Jan 24 00:50:19.054843 containerd[1682]: time="2026-01-24T00:50:19.054770210Z" level=info msg="container event discarded" container=ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059 type=CONTAINER_STARTED_EVENT Jan 24 00:50:19.084252 containerd[1682]: time="2026-01-24T00:50:19.084111176Z" level=info msg="container event discarded" container=5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff type=CONTAINER_CREATED_EVENT Jan 24 00:50:19.084252 containerd[1682]: time="2026-01-24T00:50:19.084170726Z" level=info msg="container event discarded" container=4dfe6ad20a6e7eb089b21391a61b294b137da1e41cd41bf1de11588cd19c0201 type=CONTAINER_CREATED_EVENT Jan 24 00:50:19.084252 containerd[1682]: time="2026-01-24T00:50:19.084184046Z" level=info msg="container event discarded" container=4dfe6ad20a6e7eb089b21391a61b294b137da1e41cd41bf1de11588cd19c0201 type=CONTAINER_STARTED_EVENT Jan 24 00:50:19.084252 containerd[1682]: time="2026-01-24T00:50:19.084206436Z" level=info msg="container event discarded" container=1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15 type=CONTAINER_CREATED_EVENT Jan 24 00:50:19.084252 containerd[1682]: time="2026-01-24T00:50:19.084219696Z" level=info msg="container event discarded" container=1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15 type=CONTAINER_STARTED_EVENT Jan 24 00:50:19.110826 containerd[1682]: time="2026-01-24T00:50:19.110660059Z" level=info msg="container event discarded" container=99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad type=CONTAINER_CREATED_EVENT Jan 24 00:50:19.124272 containerd[1682]: time="2026-01-24T00:50:19.124040666Z" level=info msg="container event discarded" container=bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad type=CONTAINER_CREATED_EVENT Jan 24 00:50:19.179740 containerd[1682]: time="2026-01-24T00:50:19.179563134Z" level=info msg="container event discarded" 
container=5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff type=CONTAINER_STARTED_EVENT Jan 24 00:50:19.223329 containerd[1682]: time="2026-01-24T00:50:19.223242056Z" level=info msg="container event discarded" container=99f2a910656beb11375fc9a8f310ae1f10290d2aaad6ad995ed81d06bd4375ad type=CONTAINER_STARTED_EVENT Jan 24 00:50:19.248853 containerd[1682]: time="2026-01-24T00:50:19.248757330Z" level=info msg="container event discarded" container=bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad type=CONTAINER_STARTED_EVENT Jan 24 00:50:21.617942 kubelet[2863]: E0124 00:50:21.617200 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:50:22.117103 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:50:22.117259 kernel: audit: type=1130 audit(1769215822.106:1020): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-65.109.167.77:22-4.153.228.146:38616 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:22.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-65.109.167.77:22-4.153.228.146:38616 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:22.107551 systemd[1]: Started sshd@38-65.109.167.77:22-4.153.228.146:38616.service - OpenSSH per-connection server daemon (4.153.228.146:38616). 
Jan 24 00:50:22.618583 kubelet[2863]: E0124 00:50:22.618513 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:50:22.794000 audit[5942]: USER_ACCT pid=5942 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:22.803084 kernel: audit: type=1101 audit(1769215822.794:1021): pid=5942 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:22.803150 kernel: audit: type=1103 audit(1769215822.800:1022): pid=5942 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:22.800000 audit[5942]: CRED_ACQ pid=5942 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:22.803233 sshd[5942]: Accepted publickey for core from 4.153.228.146 port 38616 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:22.804076 sshd-session[5942]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:22.812636 systemd-logind[1656]: New session 40 of user core. Jan 24 00:50:22.800000 audit[5942]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc1f94fff0 a2=3 a3=0 items=0 ppid=1 pid=5942 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:22.815666 kernel: audit: type=1006 audit(1769215822.800:1023): pid=5942 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=40 res=1 Jan 24 00:50:22.815718 kernel: audit: type=1300 audit(1769215822.800:1023): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc1f94fff0 a2=3 a3=0 items=0 ppid=1 pid=5942 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:22.800000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:22.821170 kernel: audit: type=1327 audit(1769215822.800:1023): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:22.822250 systemd[1]: Started session-40.scope - Session 40 of User core. 
Jan 24 00:50:22.825000 audit[5942]: USER_START pid=5942 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:22.833000 audit[5946]: CRED_ACQ pid=5946 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:22.835930 kernel: audit: type=1105 audit(1769215822.825:1024): pid=5942 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:22.835964 kernel: audit: type=1103 audit(1769215822.833:1025): pid=5946 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:23.263733 sshd[5946]: Connection closed by 4.153.228.146 port 38616 Jan 24 00:50:23.266384 sshd-session[5942]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:23.268000 audit[5942]: USER_END pid=5942 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:23.268000 audit[5942]: CRED_DISP pid=5942 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:23.292297 kernel: audit: type=1106 audit(1769215823.268:1026): pid=5942 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:23.292375 kernel: audit: type=1104 audit(1769215823.268:1027): pid=5942 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:23.294615 systemd[1]: sshd@38-65.109.167.77:22-4.153.228.146:38616.service: Deactivated successfully. Jan 24 00:50:23.295000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-65.109.167.77:22-4.153.228.146:38616 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:23.308183 systemd[1]: session-40.scope: Deactivated successfully. Jan 24 00:50:23.311826 systemd-logind[1656]: Session 40 logged out. Waiting for processes to exit. Jan 24 00:50:23.315023 systemd-logind[1656]: Removed session 40. 
Jan 24 00:50:25.619342 kubelet[2863]: E0124 00:50:25.619277 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:50:26.617609 kubelet[2863]: E0124 00:50:26.617424 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:50:26.619673 kubelet[2863]: E0124 00:50:26.619407 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:50:28.397945 systemd[1]: Started sshd@39-65.109.167.77:22-4.153.228.146:55538.service - OpenSSH per-connection server daemon (4.153.228.146:55538). Jan 24 00:50:28.397000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-65.109.167.77:22-4.153.228.146:55538 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:28.399594 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:50:28.399684 kernel: audit: type=1130 audit(1769215828.397:1029): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-65.109.167.77:22-4.153.228.146:55538 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:50:28.998530 containerd[1682]: time="2026-01-24T00:50:28.998420974Z" level=info msg="container event discarded" container=572656723705203c6175f044f27e219d95ffc53f3d005481a4bfcf9c92863782 type=CONTAINER_CREATED_EVENT Jan 24 00:50:28.998530 containerd[1682]: time="2026-01-24T00:50:28.998484653Z" level=info msg="container event discarded" container=572656723705203c6175f044f27e219d95ffc53f3d005481a4bfcf9c92863782 type=CONTAINER_STARTED_EVENT Jan 24 00:50:29.032825 containerd[1682]: time="2026-01-24T00:50:29.032724645Z" level=info msg="container event discarded" container=4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f type=CONTAINER_CREATED_EVENT Jan 24 00:50:29.083000 audit[5962]: USER_ACCT pid=5962 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.101374 kernel: audit: type=1101 audit(1769215829.083:1030): pid=5962 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.103669 sshd[5962]: Accepted publickey for core from 4.153.228.146 port 55538 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:29.109273 sshd-session[5962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:29.103000 audit[5962]: CRED_ACQ pid=5962 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.140649 kernel: audit: type=1103 audit(1769215829.103:1031): pid=5962 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.144276 systemd-logind[1656]: New session 41 of user core. Jan 24 00:50:29.151443 systemd[1]: Started session-41.scope - Session 41 of User core. 
Jan 24 00:50:29.152135 kernel: audit: type=1006 audit(1769215829.103:1032): pid=5962 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=41 res=1 Jan 24 00:50:29.103000 audit[5962]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe27fd3570 a2=3 a3=0 items=0 ppid=1 pid=5962 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:29.164131 kernel: audit: type=1300 audit(1769215829.103:1032): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe27fd3570 a2=3 a3=0 items=0 ppid=1 pid=5962 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:29.103000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:29.172157 kernel: audit: type=1327 audit(1769215829.103:1032): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:29.180156 kernel: audit: type=1105 audit(1769215829.162:1033): pid=5962 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.162000 audit[5962]: USER_START pid=5962 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.180304 containerd[1682]: time="2026-01-24T00:50:29.176525328Z" level=info msg="container event discarded" container=4d175fe3487afeef0cf111bfce2b950a73cfc045085f56ee0deb3163aa82107f type=CONTAINER_STARTED_EVENT Jan 24 00:50:29.164000 audit[5966]: CRED_ACQ pid=5966 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.192086 kernel: audit: type=1103 audit(1769215829.164:1034): pid=5966 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.291599 containerd[1682]: time="2026-01-24T00:50:29.290812349Z" level=info msg="container event discarded" container=c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da type=CONTAINER_CREATED_EVENT Jan 24 00:50:29.291599 containerd[1682]: time="2026-01-24T00:50:29.290850299Z" level=info msg="container event discarded" container=c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da type=CONTAINER_STARTED_EVENT Jan 24 00:50:29.522281 sshd[5966]: Connection closed by 4.153.228.146 port 55538 Jan 24 00:50:29.524280 sshd-session[5962]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:29.533138 kernel: audit: type=1106 audit(1769215829.524:1035): pid=5962 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.524000 audit[5962]: USER_END pid=5962 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.527439 systemd-logind[1656]: Session 41 logged out. Waiting for processes to exit. Jan 24 00:50:29.528938 systemd[1]: sshd@39-65.109.167.77:22-4.153.228.146:55538.service: Deactivated successfully. Jan 24 00:50:29.531981 systemd[1]: session-41.scope: Deactivated successfully. Jan 24 00:50:29.533355 systemd-logind[1656]: Removed session 41. Jan 24 00:50:29.524000 audit[5962]: CRED_DISP pid=5962 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:29.527000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-65.109.167.77:22-4.153.228.146:55538 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:29.541371 kernel: audit: type=1104 audit(1769215829.524:1036): pid=5962 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:31.325043 systemd[1805]: Created slice background.slice - User Background Tasks Slice. Jan 24 00:50:31.328572 systemd[1805]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 24 00:50:31.362512 systemd[1805]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. 
Jan 24 00:50:31.598710 containerd[1682]: time="2026-01-24T00:50:31.598505687Z" level=info msg="container event discarded" container=fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce type=CONTAINER_CREATED_EVENT Jan 24 00:50:31.646484 containerd[1682]: time="2026-01-24T00:50:31.646398033Z" level=info msg="container event discarded" container=fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce type=CONTAINER_STARTED_EVENT Jan 24 00:50:33.625451 kubelet[2863]: E0124 00:50:33.625379 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:50:34.617086 kubelet[2863]: E0124 00:50:34.617021 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:50:34.667898 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:50:34.668012 kernel: audit: type=1130 audit(1769215834.661:1038): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-65.109.167.77:22-4.153.228.146:54344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:34.661000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-65.109.167.77:22-4.153.228.146:54344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:34.662290 systemd[1]: Started sshd@40-65.109.167.77:22-4.153.228.146:54344.service - OpenSSH per-connection server daemon (4.153.228.146:54344). 
Jan 24 00:50:35.348660 sshd[6006]: Accepted publickey for core from 4.153.228.146 port 54344 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:35.347000 audit[6006]: USER_ACCT pid=6006 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.353280 sshd-session[6006]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:35.355307 kernel: audit: type=1101 audit(1769215835.347:1039): pid=6006 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.350000 audit[6006]: CRED_ACQ pid=6006 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.366350 systemd-logind[1656]: New session 42 of user core. Jan 24 00:50:35.367101 kernel: audit: type=1103 audit(1769215835.350:1040): pid=6006 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.370304 systemd[1]: Started session-42.scope - Session 42 of User core. Jan 24 00:50:35.375115 kernel: audit: type=1006 audit(1769215835.350:1041): pid=6006 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=42 res=1 Jan 24 00:50:35.350000 audit[6006]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe08843750 a2=3 a3=0 items=0 ppid=1 pid=6006 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:35.383494 kernel: audit: type=1300 audit(1769215835.350:1041): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe08843750 a2=3 a3=0 items=0 ppid=1 pid=6006 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:35.350000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:35.377000 audit[6006]: USER_START pid=6006 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.389234 kernel: audit: type=1327 audit(1769215835.350:1041): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:35.389278 kernel: audit: type=1105 audit(1769215835.377:1042): pid=6006 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.382000 audit[6010]: CRED_ACQ pid=6010 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.401079 kernel: audit: type=1103 audit(1769215835.382:1043): pid=6010 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.787729 sshd[6010]: Connection closed by 4.153.228.146 port 54344 Jan 24 00:50:35.788878 sshd-session[6006]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:35.790000 audit[6006]: USER_END pid=6006 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.794451 systemd[1]: sshd@40-65.109.167.77:22-4.153.228.146:54344.service: Deactivated successfully. Jan 24 00:50:35.798096 kernel: audit: type=1106 audit(1769215835.790:1044): pid=6006 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.797644 systemd[1]: session-42.scope: Deactivated successfully. Jan 24 00:50:35.790000 audit[6006]: CRED_DISP pid=6006 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:35.801328 systemd-logind[1656]: Session 42 logged out. Waiting for processes to exit. Jan 24 00:50:35.802270 systemd-logind[1656]: Removed session 42. Jan 24 00:50:35.792000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-65.109.167.77:22-4.153.228.146:54344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:50:35.804103 kernel: audit: type=1104 audit(1769215835.790:1045): pid=6006 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:36.616804 kubelet[2863]: E0124 00:50:36.616702 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:50:37.617659 kubelet[2863]: E0124 00:50:37.617544 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:50:37.620133 kubelet[2863]: E0124 00:50:37.619020 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:50:40.934156 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:50:40.934345 kernel: audit: type=1130 audit(1769215840.930:1047): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-65.109.167.77:22-4.153.228.146:54352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:40.930000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-65.109.167.77:22-4.153.228.146:54352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:40.931287 systemd[1]: Started sshd@41-65.109.167.77:22-4.153.228.146:54352.service - OpenSSH per-connection server daemon (4.153.228.146:54352). 
Jan 24 00:50:41.617122 kubelet[2863]: E0124 00:50:41.616753 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:50:41.627000 audit[6023]: USER_ACCT pid=6023 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:41.631092 sshd[6023]: Accepted publickey for core from 4.153.228.146 port 54352 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:41.634036 sshd-session[6023]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:41.641109 kernel: audit: type=1101 audit(1769215841.627:1048): pid=6023 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:41.631000 audit[6023]: CRED_ACQ pid=6023 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:41.652694 systemd-logind[1656]: New session 43 of user core. Jan 24 00:50:41.657092 kernel: audit: type=1103 audit(1769215841.631:1049): pid=6023 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:41.661143 systemd[1]: Started session-43.scope - Session 43 of User core. 
Jan 24 00:50:41.664254 kernel: audit: type=1006 audit(1769215841.631:1050): pid=6023 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=43 res=1 Jan 24 00:50:41.631000 audit[6023]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd3003a960 a2=3 a3=0 items=0 ppid=1 pid=6023 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:41.671088 kernel: audit: type=1300 audit(1769215841.631:1050): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd3003a960 a2=3 a3=0 items=0 ppid=1 pid=6023 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:41.631000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:41.677082 kernel: audit: type=1327 audit(1769215841.631:1050): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:41.664000 audit[6023]: USER_START pid=6023 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:41.686090 kernel: audit: type=1105 audit(1769215841.664:1051): pid=6023 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:41.670000 audit[6027]: CRED_ACQ pid=6027 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:41.693116 kernel: audit: type=1103 audit(1769215841.670:1052): pid=6027 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:42.055696 containerd[1682]: time="2026-01-24T00:50:42.055184621Z" level=info msg="container event discarded" container=b7042f6baaf09302f1d7ae7217794267e477ef5d6c3e4348caed72c75bf9638b type=CONTAINER_CREATED_EVENT Jan 24 00:50:42.055696 containerd[1682]: time="2026-01-24T00:50:42.055233021Z" level=info msg="container event discarded" container=b7042f6baaf09302f1d7ae7217794267e477ef5d6c3e4348caed72c75bf9638b type=CONTAINER_STARTED_EVENT Jan 24 00:50:42.109911 containerd[1682]: time="2026-01-24T00:50:42.109849233Z" level=info msg="container event discarded" container=006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e type=CONTAINER_CREATED_EVENT Jan 24 00:50:42.109911 containerd[1682]: time="2026-01-24T00:50:42.109887403Z" level=info msg="container event discarded" container=006723ff522a1ec89aa42472e96a2a108256b3c2490e6db62e86e8b1e883a13e type=CONTAINER_STARTED_EVENT Jan 24 00:50:42.112349 sshd[6027]: Connection closed by 4.153.228.146 port 54352 Jan 24 00:50:42.111699 sshd-session[6023]: pam_unix(sshd:session): session 
closed for user core Jan 24 00:50:42.112000 audit[6023]: USER_END pid=6023 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:42.121138 kernel: audit: type=1106 audit(1769215842.112:1053): pid=6023 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:42.115000 audit[6023]: CRED_DISP pid=6023 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:42.122182 systemd-logind[1656]: Session 43 logged out. Waiting for processes to exit. Jan 24 00:50:42.123735 systemd[1]: sshd@41-65.109.167.77:22-4.153.228.146:54352.service: Deactivated successfully. Jan 24 00:50:42.127042 systemd[1]: session-43.scope: Deactivated successfully. Jan 24 00:50:42.127167 kernel: audit: type=1104 audit(1769215842.115:1054): pid=6023 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:42.122000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-65.109.167.77:22-4.153.228.146:54352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:42.130827 systemd-logind[1656]: Removed session 43. 
Jan 24 00:50:45.287776 containerd[1682]: time="2026-01-24T00:50:45.287710053Z" level=info msg="container event discarded" container=738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28 type=CONTAINER_CREATED_EVENT Jan 24 00:50:45.357613 containerd[1682]: time="2026-01-24T00:50:45.357554744Z" level=info msg="container event discarded" container=738d29ed1e89a324bdb52fd8e5619a3acea6c553c143938ec62480d9ff877f28 type=CONTAINER_STARTED_EVENT Jan 24 00:50:46.619043 kubelet[2863]: E0124 00:50:46.618907 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:50:47.188395 containerd[1682]: time="2026-01-24T00:50:47.188321160Z" level=info msg="container event discarded" container=b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87 type=CONTAINER_CREATED_EVENT Jan 24 00:50:47.240000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-65.109.167.77:22-4.153.228.146:47576 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:47.241260 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:50:47.241324 kernel: audit: type=1130 audit(1769215847.240:1056): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-65.109.167.77:22-4.153.228.146:47576 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:47.240473 systemd[1]: Started sshd@42-65.109.167.77:22-4.153.228.146:47576.service - OpenSSH per-connection server daemon (4.153.228.146:47576). 
Jan 24 00:50:47.324801 containerd[1682]: time="2026-01-24T00:50:47.324579211Z" level=info msg="container event discarded" container=b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87 type=CONTAINER_STARTED_EVENT Jan 24 00:50:47.509285 containerd[1682]: time="2026-01-24T00:50:47.509036448Z" level=info msg="container event discarded" container=b9006ec4314553ce5a7708a7bcc95caaa40a79266a6e5d8e801c20955c360c87 type=CONTAINER_STOPPED_EVENT Jan 24 00:50:47.912000 audit[6040]: USER_ACCT pid=6040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:47.919508 sshd-session[6040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:47.920188 kernel: audit: type=1101 audit(1769215847.912:1057): pid=6040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:47.920226 sshd[6040]: Accepted publickey for core from 4.153.228.146 port 47576 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:47.912000 audit[6040]: CRED_ACQ pid=6040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:47.931091 kernel: audit: type=1103 audit(1769215847.912:1058): pid=6040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:47.934301 systemd-logind[1656]: New session 44 of user core. Jan 24 00:50:47.938080 kernel: audit: type=1006 audit(1769215847.915:1059): pid=6040 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=44 res=1 Jan 24 00:50:47.939528 systemd[1]: Started session-44.scope - Session 44 of User core. 
Jan 24 00:50:47.915000 audit[6040]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd425be7b0 a2=3 a3=0 items=0 ppid=1 pid=6040 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:47.915000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:47.952620 kernel: audit: type=1300 audit(1769215847.915:1059): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd425be7b0 a2=3 a3=0 items=0 ppid=1 pid=6040 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:47.952678 kernel: audit: type=1327 audit(1769215847.915:1059): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:47.943000 audit[6040]: USER_START pid=6040 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:47.956370 kernel: audit: type=1105 audit(1769215847.943:1060): pid=6040 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:47.950000 audit[6044]: CRED_ACQ pid=6044 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:47.963121 kernel: audit: type=1103 audit(1769215847.950:1061): pid=6044 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:48.347387 sshd[6044]: Connection closed by 4.153.228.146 port 47576 Jan 24 00:50:48.351225 sshd-session[6040]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:48.352000 audit[6040]: USER_END pid=6040 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:48.362106 kernel: audit: type=1106 audit(1769215848.352:1062): pid=6040 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:48.362733 systemd[1]: sshd@42-65.109.167.77:22-4.153.228.146:47576.service: Deactivated successfully. Jan 24 00:50:48.364761 systemd[1]: session-44.scope: Deactivated successfully. Jan 24 00:50:48.367784 systemd-logind[1656]: Session 44 logged out. Waiting for processes to exit. 
Jan 24 00:50:48.368837 systemd-logind[1656]: Removed session 44. Jan 24 00:50:48.352000 audit[6040]: CRED_DISP pid=6040 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:48.361000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-65.109.167.77:22-4.153.228.146:47576 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:48.375161 kernel: audit: type=1104 audit(1769215848.352:1063): pid=6040 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:48.486000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-65.109.167.77:22-4.153.228.146:47588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:48.486603 systemd[1]: Started sshd@43-65.109.167.77:22-4.153.228.146:47588.service - OpenSSH per-connection server daemon (4.153.228.146:47588). Jan 24 00:50:48.617753 kubelet[2863]: E0124 00:50:48.617615 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:50:48.620417 kubelet[2863]: E0124 00:50:48.619511 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:50:49.175000 audit[6057]: USER_ACCT pid=6057 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:49.177009 sshd[6057]: Accepted publickey for core from 4.153.228.146 port 47588 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:49.177000 audit[6057]: CRED_ACQ pid=6057 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:49.177000 audit[6057]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffc45ce860 a2=3 a3=0 items=0 ppid=1 pid=6057 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:49.177000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:49.179255 sshd-session[6057]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:49.185090 systemd-logind[1656]: New session 45 of user core. Jan 24 00:50:49.192332 systemd[1]: Started session-45.scope - Session 45 of User core. Jan 24 00:50:49.195000 audit[6057]: USER_START pid=6057 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:49.198000 audit[6061]: CRED_ACQ pid=6061 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:49.617889 kubelet[2863]: E0124 00:50:49.617822 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:50:49.697203 sshd[6061]: Connection closed by 4.153.228.146 port 47588 Jan 24 00:50:49.698213 sshd-session[6057]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:49.703000 audit[6057]: USER_END pid=6057 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:49.703000 audit[6057]: CRED_DISP pid=6057 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:49.709857 systemd-logind[1656]: Session 45 logged out. Waiting for processes to exit. Jan 24 00:50:49.711954 systemd[1]: sshd@43-65.109.167.77:22-4.153.228.146:47588.service: Deactivated successfully. Jan 24 00:50:49.712000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-65.109.167.77:22-4.153.228.146:47588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:49.714835 systemd[1]: session-45.scope: Deactivated successfully. Jan 24 00:50:49.718286 systemd-logind[1656]: Removed session 45. 
Jan 24 00:50:49.830000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-65.109.167.77:22-4.153.228.146:47604 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:49.831436 systemd[1]: Started sshd@44-65.109.167.77:22-4.153.228.146:47604.service - OpenSSH per-connection server daemon (4.153.228.146:47604). Jan 24 00:50:50.496000 audit[6071]: USER_ACCT pid=6071 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:50.497531 sshd[6071]: Accepted publickey for core from 4.153.228.146 port 47604 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:50.498000 audit[6071]: CRED_ACQ pid=6071 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:50.498000 audit[6071]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea84955c0 a2=3 a3=0 items=0 ppid=1 pid=6071 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:50.498000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:50.500109 sshd-session[6071]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:50.510225 systemd-logind[1656]: New session 46 of user core. Jan 24 00:50:50.517222 systemd[1]: Started session-46.scope - Session 46 of User core. Jan 24 00:50:50.524000 audit[6071]: USER_START pid=6071 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:50.528000 audit[6075]: CRED_ACQ pid=6075 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:50.998110 sshd[6075]: Connection closed by 4.153.228.146 port 47604 Jan 24 00:50:50.999993 sshd-session[6071]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:51.005000 audit[6071]: USER_END pid=6071 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:51.006000 audit[6071]: CRED_DISP pid=6071 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:51.012311 systemd[1]: sshd@44-65.109.167.77:22-4.153.228.146:47604.service: Deactivated successfully. 
Jan 24 00:50:51.012000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-65.109.167.77:22-4.153.228.146:47604 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:51.017331 systemd[1]: session-46.scope: Deactivated successfully. Jan 24 00:50:51.021409 systemd-logind[1656]: Session 46 logged out. Waiting for processes to exit. Jan 24 00:50:51.025575 systemd-logind[1656]: Removed session 46. Jan 24 00:50:51.618846 kubelet[2863]: E0124 00:50:51.618801 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:50:51.918195 containerd[1682]: time="2026-01-24T00:50:51.918019499Z" level=info msg="container event discarded" container=bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4 type=CONTAINER_CREATED_EVENT Jan 24 00:50:52.095680 containerd[1682]: time="2026-01-24T00:50:52.095578367Z" level=info msg="container event discarded" container=bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4 type=CONTAINER_STARTED_EVENT Jan 24 00:50:52.785000 containerd[1682]: time="2026-01-24T00:50:52.784913877Z" level=info msg="container event discarded" container=bbca05c2e66c1ee017f51cb6b4f7d3ad14e4d7573c28bbd3bc34c7fe6f355fa4 type=CONTAINER_STOPPED_EVENT Jan 24 00:50:55.621700 kubelet[2863]: E0124 00:50:55.621599 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:50:56.137696 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 24 00:50:56.137780 kernel: audit: type=1130 audit(1769215856.129:1083): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-65.109.167.77:22-4.153.228.146:45678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:56.129000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-65.109.167.77:22-4.153.228.146:45678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:50:56.129333 systemd[1]: Started sshd@45-65.109.167.77:22-4.153.228.146:45678.service - OpenSSH per-connection server daemon (4.153.228.146:45678). 
Jan 24 00:50:56.797000 audit[6087]: USER_ACCT pid=6087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:56.813221 kernel: audit: type=1101 audit(1769215856.797:1084): pid=6087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:56.814039 sshd[6087]: Accepted publickey for core from 4.153.228.146 port 45678 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:50:56.816814 sshd-session[6087]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:50:56.814000 audit[6087]: CRED_ACQ pid=6087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:56.836948 kernel: audit: type=1103 audit(1769215856.814:1085): pid=6087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:56.837024 kernel: audit: type=1006 audit(1769215856.814:1086): pid=6087 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=47 res=1 Jan 24 00:50:56.814000 audit[6087]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff44282270 a2=3 a3=0 items=0 ppid=1 pid=6087 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:56.838417 systemd-logind[1656]: New session 47 of user core. Jan 24 00:50:56.814000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:56.847279 kernel: audit: type=1300 audit(1769215856.814:1086): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff44282270 a2=3 a3=0 items=0 ppid=1 pid=6087 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:50:56.847417 kernel: audit: type=1327 audit(1769215856.814:1086): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:50:56.850281 systemd[1]: Started session-47.scope - Session 47 of User core. 
Jan 24 00:50:56.855000 audit[6087]: USER_START pid=6087 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:56.863098 kernel: audit: type=1105 audit(1769215856.855:1087): pid=6087 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:56.863000 audit[6091]: CRED_ACQ pid=6091 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:56.871090 kernel: audit: type=1103 audit(1769215856.863:1088): pid=6091 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:57.298093 sshd[6091]: Connection closed by 4.153.228.146 port 45678 Jan 24 00:50:57.299298 sshd-session[6087]: pam_unix(sshd:session): session closed for user core Jan 24 00:50:57.301000 audit[6087]: USER_END pid=6087 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:57.304493 systemd-logind[1656]: Session 47 logged out. Waiting for processes to exit. Jan 24 00:50:57.306440 systemd[1]: sshd@45-65.109.167.77:22-4.153.228.146:45678.service: Deactivated successfully. Jan 24 00:50:57.309608 systemd[1]: session-47.scope: Deactivated successfully. Jan 24 00:50:57.313004 systemd-logind[1656]: Removed session 47. Jan 24 00:50:57.319103 kernel: audit: type=1106 audit(1769215857.301:1089): pid=6087 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:57.319177 kernel: audit: type=1104 audit(1769215857.301:1090): pid=6087 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:57.301000 audit[6087]: CRED_DISP pid=6087 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:50:57.306000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-65.109.167.77:22-4.153.228.146:45678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:50:59.618696 kubelet[2863]: E0124 00:50:59.618654 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:50:59.620235 kubelet[2863]: E0124 00:50:59.620197 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:51:01.194615 containerd[1682]: time="2026-01-24T00:51:01.194526378Z" level=info msg="container event discarded" container=05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84 type=CONTAINER_CREATED_EVENT Jan 24 00:51:01.359452 containerd[1682]: time="2026-01-24T00:51:01.359384342Z" level=info msg="container event discarded" container=05ed700eca5f54aebe759ead0e7523876cc38ecc531f60cbc1f84f3d8c3a2b84 type=CONTAINER_STARTED_EVENT Jan 24 00:51:01.615719 kubelet[2863]: E0124 00:51:01.615340 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:51:02.448096 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:02.448289 kernel: audit: type=1130 audit(1769215862.444:1092): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-65.109.167.77:22-4.153.228.146:45682 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:02.444000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-65.109.167.77:22-4.153.228.146:45682 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:02.444162 systemd[1]: Started sshd@46-65.109.167.77:22-4.153.228.146:45682.service - OpenSSH per-connection server daemon (4.153.228.146:45682). 
Jan 24 00:51:02.558717 containerd[1682]: time="2026-01-24T00:51:02.558625441Z" level=info msg="container event discarded" container=da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7 type=CONTAINER_CREATED_EVENT Jan 24 00:51:02.559751 containerd[1682]: time="2026-01-24T00:51:02.559447181Z" level=info msg="container event discarded" container=da5909e15b2bf97632274e14e9528bbe67a8307c303dc203f262ecc1d70b15b7 type=CONTAINER_STARTED_EVENT Jan 24 00:51:03.136000 audit[6119]: USER_ACCT pid=6119 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.142306 sshd-session[6119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:03.143582 sshd[6119]: Accepted publickey for core from 4.153.228.146 port 45682 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:03.146135 kernel: audit: type=1101 audit(1769215863.136:1093): pid=6119 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.140000 audit[6119]: CRED_ACQ pid=6119 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.151986 systemd-logind[1656]: New session 48 of user core. Jan 24 00:51:03.154077 kernel: audit: type=1103 audit(1769215863.140:1094): pid=6119 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.159213 systemd[1]: Started session-48.scope - Session 48 of User core. 
Jan 24 00:51:03.160079 kernel: audit: type=1006 audit(1769215863.140:1095): pid=6119 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=48 res=1 Jan 24 00:51:03.140000 audit[6119]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdac9ee250 a2=3 a3=0 items=0 ppid=1 pid=6119 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:03.169872 kernel: audit: type=1300 audit(1769215863.140:1095): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdac9ee250 a2=3 a3=0 items=0 ppid=1 pid=6119 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:03.140000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:03.180265 kernel: audit: type=1327 audit(1769215863.140:1095): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:03.180323 kernel: audit: type=1105 audit(1769215863.169:1096): pid=6119 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.169000 audit[6119]: USER_START pid=6119 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.173000 audit[6148]: CRED_ACQ pid=6148 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.190074 kernel: audit: type=1103 audit(1769215863.173:1097): pid=6148 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.617847 kubelet[2863]: E0124 00:51:03.617686 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:51:03.628110 sshd[6148]: Connection closed by 4.153.228.146 port 45682 Jan 24 00:51:03.628450 
sshd-session[6119]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:03.631000 audit[6119]: USER_END pid=6119 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.634911 systemd[1]: sshd@46-65.109.167.77:22-4.153.228.146:45682.service: Deactivated successfully. Jan 24 00:51:03.637940 systemd[1]: session-48.scope: Deactivated successfully. Jan 24 00:51:03.644503 systemd-logind[1656]: Session 48 logged out. Waiting for processes to exit. Jan 24 00:51:03.645925 systemd-logind[1656]: Removed session 48. Jan 24 00:51:03.648226 kernel: audit: type=1106 audit(1769215863.631:1098): pid=6119 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.631000 audit[6119]: CRED_DISP pid=6119 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.661136 kernel: audit: type=1104 audit(1769215863.631:1099): pid=6119 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:03.635000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-65.109.167.77:22-4.153.228.146:45682 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:51:04.919409 containerd[1682]: time="2026-01-24T00:51:04.919290326Z" level=info msg="container event discarded" container=320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e type=CONTAINER_CREATED_EVENT Jan 24 00:51:04.919409 containerd[1682]: time="2026-01-24T00:51:04.919410307Z" level=info msg="container event discarded" container=320f1c06e6c1ca97896ac9667df0e7ab4461e9915a795a29f54090f52b21888e type=CONTAINER_STARTED_EVENT Jan 24 00:51:04.950924 containerd[1682]: time="2026-01-24T00:51:04.950796006Z" level=info msg="container event discarded" container=39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312 type=CONTAINER_CREATED_EVENT Jan 24 00:51:05.004586 containerd[1682]: time="2026-01-24T00:51:05.004473015Z" level=info msg="container event discarded" container=39a57c4af03dd69383307c2b844145b451243c446ab3d30ae72b21167e7a2312 type=CONTAINER_STARTED_EVENT Jan 24 00:51:05.622485 kubelet[2863]: E0124 00:51:05.622424 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:51:06.008520 containerd[1682]: time="2026-01-24T00:51:06.008404674Z" level=info msg="container event discarded" container=57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03 type=CONTAINER_CREATED_EVENT Jan 24 00:51:06.008520 containerd[1682]: time="2026-01-24T00:51:06.008447254Z" level=info msg="container event discarded" container=57ae81604a42f3f1e02214b2c1764cb8a6284b575340f332465b0e5f3e99ca03 type=CONTAINER_STARTED_EVENT Jan 24 00:51:06.044668 containerd[1682]: time="2026-01-24T00:51:06.044599953Z" level=info msg="container event discarded" container=d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147 type=CONTAINER_CREATED_EVENT Jan 24 00:51:06.044668 containerd[1682]: time="2026-01-24T00:51:06.044645793Z" level=info msg="container event discarded" container=d945420ae355e786d6be4319ca8a034e86d3f86aff43fa4069a85961e9d9f147 type=CONTAINER_STARTED_EVENT Jan 24 00:51:06.948731 containerd[1682]: time="2026-01-24T00:51:06.948652055Z" level=info msg="container event discarded" container=69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e type=CONTAINER_CREATED_EVENT Jan 24 00:51:06.948731 containerd[1682]: time="2026-01-24T00:51:06.948719685Z" level=info msg="container event discarded" container=69e06ae8bc265e1fb1f3418504deaeb8c4f95ec1386d6bd13e92f3c9c3c75c9e type=CONTAINER_STARTED_EVENT Jan 24 00:51:07.075000 containerd[1682]: time="2026-01-24T00:51:07.074939282Z" level=info msg="container event discarded" container=c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1 type=CONTAINER_CREATED_EVENT Jan 24 00:51:07.075000 containerd[1682]: time="2026-01-24T00:51:07.074991592Z" level=info msg="container event discarded" container=c12906afa58629820f0afacb4581293e9143663acd69829fe97374dfe4ffcad1 type=CONTAINER_STARTED_EVENT Jan 24 00:51:07.097163 containerd[1682]: time="2026-01-24T00:51:07.097107452Z" level=info msg="container event discarded" container=5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042 type=CONTAINER_CREATED_EVENT Jan 24 
00:51:07.150433 containerd[1682]: time="2026-01-24T00:51:07.150326850Z" level=info msg="container event discarded" container=d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a type=CONTAINER_CREATED_EVENT Jan 24 00:51:07.150433 containerd[1682]: time="2026-01-24T00:51:07.150369840Z" level=info msg="container event discarded" container=d8fecce5d72eb368010025fa119cf35a34d61afa107134a7c3229da4268e400a type=CONTAINER_STARTED_EVENT Jan 24 00:51:07.167645 containerd[1682]: time="2026-01-24T00:51:07.167521480Z" level=info msg="container event discarded" container=5d689b5a0b7463a5791ec5d849348096d8c98ab6487253dc96e6bb3d0a30c042 type=CONTAINER_STARTED_EVENT Jan 24 00:51:07.937098 containerd[1682]: time="2026-01-24T00:51:07.937012249Z" level=info msg="container event discarded" container=29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae type=CONTAINER_CREATED_EVENT Jan 24 00:51:07.937098 containerd[1682]: time="2026-01-24T00:51:07.937075279Z" level=info msg="container event discarded" container=29c73deb7a5e94e73efade8c4fe4fbb54e87f88d231e00c3735cd2e1b2dc0cae type=CONTAINER_STARTED_EVENT Jan 24 00:51:08.780796 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:08.780939 kernel: audit: type=1130 audit(1769215868.765:1101): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-65.109.167.77:22-4.153.228.146:53140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:08.765000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-65.109.167.77:22-4.153.228.146:53140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:08.765339 systemd[1]: Started sshd@47-65.109.167.77:22-4.153.228.146:53140.service - OpenSSH per-connection server daemon (4.153.228.146:53140). 
Jan 24 00:51:09.456000 audit[6166]: USER_ACCT pid=6166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.460008 sshd-session[6166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:09.462091 sshd[6166]: Accepted publickey for core from 4.153.228.146 port 53140 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:09.465109 kernel: audit: type=1101 audit(1769215869.456:1102): pid=6166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.465215 kernel: audit: type=1103 audit(1769215869.457:1103): pid=6166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.457000 audit[6166]: CRED_ACQ pid=6166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.469924 kernel: audit: type=1006 audit(1769215869.458:1104): pid=6166 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=49 res=1 Jan 24 00:51:09.458000 audit[6166]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcea668e00 a2=3 a3=0 items=0 ppid=1 pid=6166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:09.480589 systemd-logind[1656]: New session 49 of user core. Jan 24 00:51:09.458000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:09.486575 kernel: audit: type=1300 audit(1769215869.458:1104): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcea668e00 a2=3 a3=0 items=0 ppid=1 pid=6166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:09.486672 kernel: audit: type=1327 audit(1769215869.458:1104): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:09.490677 systemd[1]: Started session-49.scope - Session 49 of User core. 
Jan 24 00:51:09.498000 audit[6166]: USER_START pid=6166 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.505000 audit[6170]: CRED_ACQ pid=6170 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.507024 kernel: audit: type=1105 audit(1769215869.498:1105): pid=6166 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.507116 kernel: audit: type=1103 audit(1769215869.505:1106): pid=6170 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.904134 sshd[6170]: Connection closed by 4.153.228.146 port 53140 Jan 24 00:51:09.905633 sshd-session[6166]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:09.908000 audit[6166]: USER_END pid=6166 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.911542 systemd[1]: sshd@47-65.109.167.77:22-4.153.228.146:53140.service: Deactivated successfully. Jan 24 00:51:09.914048 systemd[1]: session-49.scope: Deactivated successfully. Jan 24 00:51:09.918051 systemd-logind[1656]: Session 49 logged out. Waiting for processes to exit. Jan 24 00:51:09.919129 systemd-logind[1656]: Removed session 49. Jan 24 00:51:09.926158 kernel: audit: type=1106 audit(1769215869.908:1107): pid=6166 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.908000 audit[6166]: CRED_DISP pid=6166 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:09.912000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-65.109.167.77:22-4.153.228.146:53140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:51:09.939105 kernel: audit: type=1104 audit(1769215869.908:1108): pid=6166 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:10.618235 kubelet[2863]: E0124 00:51:10.618166 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:51:11.617761 kubelet[2863]: E0124 00:51:11.617292 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:51:12.618032 kubelet[2863]: E0124 00:51:12.617963 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:51:15.039531 systemd[1]: Started sshd@48-65.109.167.77:22-4.153.228.146:40764.service - OpenSSH per-connection server daemon (4.153.228.146:40764). Jan 24 00:51:15.039000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-65.109.167.77:22-4.153.228.146:40764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:15.044104 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:15.044188 kernel: audit: type=1130 audit(1769215875.039:1110): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-65.109.167.77:22-4.153.228.146:40764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:51:15.616444 kubelet[2863]: E0124 00:51:15.616395 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:51:15.727000 audit[6182]: USER_ACCT pid=6182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:15.730177 sshd-session[6182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:15.731922 sshd[6182]: Accepted publickey for core from 4.153.228.146 port 40764 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:15.727000 audit[6182]: CRED_ACQ pid=6182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:15.735362 kernel: audit: type=1101 audit(1769215875.727:1111): pid=6182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:15.735414 kernel: audit: type=1103 audit(1769215875.727:1112): pid=6182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:15.740679 kernel: audit: type=1006 audit(1769215875.727:1113): pid=6182 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=50 res=1 Jan 24 00:51:15.727000 audit[6182]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcaab8b030 a2=3 a3=0 items=0 ppid=1 pid=6182 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:15.744800 kernel: audit: type=1300 audit(1769215875.727:1113): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcaab8b030 a2=3 a3=0 items=0 ppid=1 pid=6182 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:15.745276 systemd-logind[1656]: New session 50 of user core. Jan 24 00:51:15.727000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:15.750572 kernel: audit: type=1327 audit(1769215875.727:1113): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:15.754220 systemd[1]: Started session-50.scope - Session 50 of User core. 
Jan 24 00:51:15.760000 audit[6182]: USER_START pid=6182 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:15.761000 audit[6186]: CRED_ACQ pid=6186 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:15.768164 kernel: audit: type=1105 audit(1769215875.760:1114): pid=6182 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:15.768215 kernel: audit: type=1103 audit(1769215875.761:1115): pid=6186 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:16.200444 sshd[6186]: Connection closed by 4.153.228.146 port 40764 Jan 24 00:51:16.201447 sshd-session[6182]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:16.204000 audit[6182]: USER_END pid=6182 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:16.212741 systemd-logind[1656]: Session 50 logged out. Waiting for processes to exit. Jan 24 00:51:16.215624 systemd[1]: sshd@48-65.109.167.77:22-4.153.228.146:40764.service: Deactivated successfully. Jan 24 00:51:16.205000 audit[6182]: CRED_DISP pid=6182 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:16.225565 kernel: audit: type=1106 audit(1769215876.204:1116): pid=6182 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:16.225659 kernel: audit: type=1104 audit(1769215876.205:1117): pid=6182 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:16.225158 systemd[1]: session-50.scope: Deactivated successfully. Jan 24 00:51:16.231317 systemd-logind[1656]: Removed session 50. Jan 24 00:51:16.216000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-65.109.167.77:22-4.153.228.146:40764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:51:16.616360 kubelet[2863]: E0124 00:51:16.615848 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:51:17.617944 kubelet[2863]: E0124 00:51:17.617851 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:51:21.357035 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:21.357201 kernel: audit: type=1130 audit(1769215881.344:1119): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-65.109.167.77:22-4.153.228.146:40780 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:21.344000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-65.109.167.77:22-4.153.228.146:40780 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:21.343552 systemd[1]: Started sshd@49-65.109.167.77:22-4.153.228.146:40780.service - OpenSSH per-connection server daemon (4.153.228.146:40780). 
Jan 24 00:51:22.043000 audit[6198]: USER_ACCT pid=6198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.046199 sshd-session[6198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:22.050996 sshd[6198]: Accepted publickey for core from 4.153.228.146 port 40780 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:22.051221 kernel: audit: type=1101 audit(1769215882.043:1120): pid=6198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.044000 audit[6198]: CRED_ACQ pid=6198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.058590 systemd-logind[1656]: New session 51 of user core. Jan 24 00:51:22.063234 kernel: audit: type=1103 audit(1769215882.044:1121): pid=6198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.063289 kernel: audit: type=1006 audit(1769215882.044:1122): pid=6198 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=51 res=1 Jan 24 00:51:22.067180 kernel: audit: type=1300 audit(1769215882.044:1122): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe38281460 a2=3 a3=0 items=0 ppid=1 pid=6198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:22.044000 audit[6198]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe38281460 a2=3 a3=0 items=0 ppid=1 pid=6198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:22.064227 systemd[1]: Started session-51.scope - Session 51 of User core. 
Jan 24 00:51:22.044000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:22.071764 kernel: audit: type=1327 audit(1769215882.044:1122): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:22.078359 kernel: audit: type=1105 audit(1769215882.073:1123): pid=6198 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.073000 audit[6198]: USER_START pid=6198 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.082000 audit[6202]: CRED_ACQ pid=6202 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.088098 kernel: audit: type=1103 audit(1769215882.082:1124): pid=6202 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.472143 sshd[6202]: Connection closed by 4.153.228.146 port 40780 Jan 24 00:51:22.472677 sshd-session[6198]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:22.475000 audit[6198]: USER_END pid=6198 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.484933 systemd[1]: sshd@49-65.109.167.77:22-4.153.228.146:40780.service: Deactivated successfully. Jan 24 00:51:22.485395 kernel: audit: type=1106 audit(1769215882.475:1125): pid=6198 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.475000 audit[6198]: CRED_DISP pid=6198 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.488856 systemd[1]: session-51.scope: Deactivated successfully. Jan 24 00:51:22.493181 systemd-logind[1656]: Session 51 logged out. Waiting for processes to exit. Jan 24 00:51:22.494319 systemd-logind[1656]: Removed session 51. Jan 24 00:51:22.485000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-65.109.167.77:22-4.153.228.146:40780 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:51:22.501098 kernel: audit: type=1104 audit(1769215882.475:1126): pid=6198 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:22.616482 kubelet[2863]: E0124 00:51:22.616429 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:51:22.617198 kubelet[2863]: E0124 00:51:22.617009 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:51:23.620092 kubelet[2863]: E0124 00:51:23.620017 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:51:27.613229 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:27.613370 kernel: audit: type=1130 audit(1769215887.609:1128): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-65.109.167.77:22-4.153.228.146:45080 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:27.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-65.109.167.77:22-4.153.228.146:45080 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:27.609769 systemd[1]: Started sshd@50-65.109.167.77:22-4.153.228.146:45080.service - OpenSSH per-connection server daemon (4.153.228.146:45080). 
Jan 24 00:51:27.628584 kubelet[2863]: E0124 00:51:27.628386 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:51:28.300000 audit[6216]: USER_ACCT pid=6216 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.301235 sshd[6216]: Accepted publickey for core from 4.153.228.146 port 45080 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:28.307103 kernel: audit: type=1101 audit(1769215888.300:1129): pid=6216 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.309308 sshd-session[6216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:28.317107 systemd-logind[1656]: New session 52 of user core. Jan 24 00:51:28.308000 audit[6216]: CRED_ACQ pid=6216 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.322793 systemd[1]: Started session-52.scope - Session 52 of User core. 
Jan 24 00:51:28.323082 kernel: audit: type=1103 audit(1769215888.308:1130): pid=6216 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.334084 kernel: audit: type=1006 audit(1769215888.308:1131): pid=6216 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=52 res=1 Jan 24 00:51:28.308000 audit[6216]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff303b21a0 a2=3 a3=0 items=0 ppid=1 pid=6216 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:28.343085 kernel: audit: type=1300 audit(1769215888.308:1131): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff303b21a0 a2=3 a3=0 items=0 ppid=1 pid=6216 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:28.308000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:28.348085 kernel: audit: type=1327 audit(1769215888.308:1131): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:28.327000 audit[6216]: USER_START pid=6216 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.334000 audit[6220]: CRED_ACQ pid=6220 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.357286 kernel: audit: type=1105 audit(1769215888.327:1132): pid=6216 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.357320 kernel: audit: type=1103 audit(1769215888.334:1133): pid=6220 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.782838 sshd[6220]: Connection closed by 4.153.228.146 port 45080 Jan 24 00:51:28.784369 sshd-session[6216]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:28.788000 audit[6216]: USER_END pid=6216 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.807325 kernel: audit: type=1106 audit(1769215888.788:1134): pid=6216 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.788000 audit[6216]: CRED_DISP pid=6216 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.814453 systemd-logind[1656]: Session 52 logged out. Waiting for processes to exit. Jan 24 00:51:28.816663 systemd[1]: sshd@50-65.109.167.77:22-4.153.228.146:45080.service: Deactivated successfully. Jan 24 00:51:28.818104 kernel: audit: type=1104 audit(1769215888.788:1135): pid=6216 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:28.818000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-65.109.167.77:22-4.153.228.146:45080 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:28.821941 systemd[1]: session-52.scope: Deactivated successfully. Jan 24 00:51:28.829156 systemd-logind[1656]: Removed session 52. Jan 24 00:51:30.617914 kubelet[2863]: E0124 00:51:30.617823 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:51:30.618901 kubelet[2863]: E0124 00:51:30.618288 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:51:33.620029 kubelet[2863]: E0124 00:51:33.619956 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" 
podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:51:33.948477 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:33.948689 kernel: audit: type=1130 audit(1769215893.929:1137): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-65.109.167.77:22-4.153.228.146:45084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:33.929000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-65.109.167.77:22-4.153.228.146:45084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:33.928609 systemd[1]: Started sshd@51-65.109.167.77:22-4.153.228.146:45084.service - OpenSSH per-connection server daemon (4.153.228.146:45084). Jan 24 00:51:34.617903 kubelet[2863]: E0124 00:51:34.617834 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:51:34.646000 audit[6267]: USER_ACCT pid=6267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:34.651586 sshd-session[6267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:34.661353 kernel: audit: type=1101 audit(1769215894.646:1138): pid=6267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:34.661404 sshd[6267]: Accepted publickey for core from 4.153.228.146 port 45084 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:34.647000 audit[6267]: CRED_ACQ pid=6267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:34.677088 kernel: audit: type=1103 audit(1769215894.647:1139): pid=6267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:34.677162 kernel: audit: type=1006 audit(1769215894.647:1140): pid=6267 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=53 res=1 Jan 24 00:51:34.647000 audit[6267]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc33896290 a2=3 a3=0 items=0 ppid=1 pid=6267 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 
00:51:34.685231 kernel: audit: type=1300 audit(1769215894.647:1140): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc33896290 a2=3 a3=0 items=0 ppid=1 pid=6267 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:34.647000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:34.689135 kernel: audit: type=1327 audit(1769215894.647:1140): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:34.690713 systemd-logind[1656]: New session 53 of user core. Jan 24 00:51:34.697198 systemd[1]: Started session-53.scope - Session 53 of User core. Jan 24 00:51:34.707000 audit[6267]: USER_START pid=6267 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:34.717087 kernel: audit: type=1105 audit(1769215894.707:1141): pid=6267 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:34.711000 audit[6271]: CRED_ACQ pid=6271 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:34.727073 kernel: audit: type=1103 audit(1769215894.711:1142): pid=6271 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:35.129094 sshd[6271]: Connection closed by 4.153.228.146 port 45084 Jan 24 00:51:35.129231 sshd-session[6267]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:35.131000 audit[6267]: USER_END pid=6267 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:35.134015 systemd-logind[1656]: Session 53 logged out. Waiting for processes to exit. Jan 24 00:51:35.135046 systemd[1]: sshd@51-65.109.167.77:22-4.153.228.146:45084.service: Deactivated successfully. Jan 24 00:51:35.137695 systemd[1]: session-53.scope: Deactivated successfully. 
Jan 24 00:51:35.140111 kernel: audit: type=1106 audit(1769215895.131:1143): pid=6267 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:35.131000 audit[6267]: CRED_DISP pid=6267 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:35.141470 systemd-logind[1656]: Removed session 53. Jan 24 00:51:35.134000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-65.109.167.77:22-4.153.228.146:45084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:35.148089 kernel: audit: type=1104 audit(1769215895.131:1144): pid=6267 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:35.617346 containerd[1682]: time="2026-01-24T00:51:35.616967835Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 24 00:51:36.077444 containerd[1682]: time="2026-01-24T00:51:36.077301253Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:51:36.079075 containerd[1682]: time="2026-01-24T00:51:36.078691933Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 24 00:51:36.079204 containerd[1682]: time="2026-01-24T00:51:36.078822472Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 24 00:51:36.080194 kubelet[2863]: E0124 00:51:36.080149 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:51:36.080486 kubelet[2863]: E0124 00:51:36.080203 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:51:36.080486 kubelet[2863]: E0124 00:51:36.080298 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c13ad1f9d9cd499a81c814dc308eb491,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 24 00:51:36.082318 containerd[1682]: time="2026-01-24T00:51:36.082288362Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 24 00:51:36.513454 containerd[1682]: time="2026-01-24T00:51:36.512923495Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:51:36.514233 containerd[1682]: time="2026-01-24T00:51:36.514194895Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 24 00:51:36.514484 containerd[1682]: time="2026-01-24T00:51:36.514263225Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 24 00:51:36.515098 kubelet[2863]: E0124 00:51:36.514783 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:51:36.515098 kubelet[2863]: E0124 00:51:36.514869 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:51:36.515098 kubelet[2863]: E0124 00:51:36.515014 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 24 00:51:36.516593 kubelet[2863]: E0124 00:51:36.516542 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:51:40.273040 systemd[1]: Started sshd@52-65.109.167.77:22-4.153.228.146:47822.service - OpenSSH per-connection server daemon (4.153.228.146:47822). Jan 24 00:51:40.286102 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:40.286234 kernel: audit: type=1130 audit(1769215900.274:1146): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-65.109.167.77:22-4.153.228.146:47822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:51:40.274000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-65.109.167.77:22-4.153.228.146:47822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:40.616999 kubelet[2863]: E0124 00:51:40.616907 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:51:40.986322 kernel: audit: type=1101 audit(1769215900.970:1147): pid=6282 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:40.970000 audit[6282]: USER_ACCT pid=6282 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:40.986687 sshd[6282]: Accepted publickey for core from 4.153.228.146 port 47822 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:40.993795 sshd-session[6282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:40.990000 audit[6282]: CRED_ACQ pid=6282 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.011485 kernel: audit: type=1103 audit(1769215900.990:1148): pid=6282 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.018527 systemd-logind[1656]: New session 54 of user core. 
Jan 24 00:51:41.020095 kernel: audit: type=1006 audit(1769215900.991:1149): pid=6282 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=54 res=1 Jan 24 00:51:40.991000 audit[6282]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff4b155f90 a2=3 a3=0 items=0 ppid=1 pid=6282 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:41.034155 kernel: audit: type=1300 audit(1769215900.991:1149): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff4b155f90 a2=3 a3=0 items=0 ppid=1 pid=6282 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:41.034246 kernel: audit: type=1327 audit(1769215900.991:1149): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:40.991000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:41.036470 systemd[1]: Started session-54.scope - Session 54 of User core. Jan 24 00:51:41.041000 audit[6282]: USER_START pid=6282 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.052086 kernel: audit: type=1105 audit(1769215901.041:1150): pid=6282 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.052151 kernel: audit: type=1103 audit(1769215901.049:1151): pid=6286 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.049000 audit[6286]: CRED_ACQ pid=6286 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.402212 sshd[6286]: Connection closed by 4.153.228.146 port 47822 Jan 24 00:51:41.402547 sshd-session[6282]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:41.403000 audit[6282]: USER_END pid=6282 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.406743 systemd-logind[1656]: Session 54 logged out. Waiting for processes to exit. Jan 24 00:51:41.408812 systemd[1]: sshd@52-65.109.167.77:22-4.153.228.146:47822.service: Deactivated successfully. Jan 24 00:51:41.410799 systemd[1]: session-54.scope: Deactivated successfully. 
Jan 24 00:51:41.413146 kernel: audit: type=1106 audit(1769215901.403:1152): pid=6282 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.404000 audit[6282]: CRED_DISP pid=6282 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.415197 systemd-logind[1656]: Removed session 54. Jan 24 00:51:41.419090 kernel: audit: type=1104 audit(1769215901.404:1153): pid=6282 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:41.409000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-65.109.167.77:22-4.153.228.146:47822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:42.616989 kubelet[2863]: E0124 00:51:42.616738 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:51:44.618995 kubelet[2863]: E0124 00:51:44.618852 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:51:46.548293 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:46.548412 kernel: audit: type=1130 audit(1769215906.539:1155): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-65.109.167.77:22-4.153.228.146:41204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:46.539000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-65.109.167.77:22-4.153.228.146:41204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:51:46.539393 systemd[1]: Started sshd@53-65.109.167.77:22-4.153.228.146:41204.service - OpenSSH per-connection server daemon (4.153.228.146:41204). Jan 24 00:51:47.230000 audit[6300]: USER_ACCT pid=6300 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.239158 sshd[6300]: Accepted publickey for core from 4.153.228.146 port 41204 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:47.243464 sshd-session[6300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:47.241000 audit[6300]: CRED_ACQ pid=6300 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.248745 kernel: audit: type=1101 audit(1769215907.230:1156): pid=6300 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.248846 kernel: audit: type=1103 audit(1769215907.241:1157): pid=6300 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.264209 kernel: audit: type=1006 audit(1769215907.241:1158): pid=6300 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=55 res=1 Jan 24 00:51:47.261392 systemd-logind[1656]: New session 55 of user core. Jan 24 00:51:47.272301 systemd[1]: Started session-55.scope - Session 55 of User core. 
Jan 24 00:51:47.241000 audit[6300]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe41162300 a2=3 a3=0 items=0 ppid=1 pid=6300 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:47.241000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:47.290606 kernel: audit: type=1300 audit(1769215907.241:1158): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe41162300 a2=3 a3=0 items=0 ppid=1 pid=6300 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:47.290669 kernel: audit: type=1327 audit(1769215907.241:1158): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:47.286000 audit[6300]: USER_START pid=6300 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.298726 kernel: audit: type=1105 audit(1769215907.286:1159): pid=6300 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.291000 audit[6304]: CRED_ACQ pid=6304 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.312999 kernel: audit: type=1103 audit(1769215907.291:1160): pid=6304 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.617975 containerd[1682]: time="2026-01-24T00:51:47.617709249Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 24 00:51:47.680515 sshd[6304]: Connection closed by 4.153.228.146 port 41204 Jan 24 00:51:47.681250 sshd-session[6300]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:47.683000 audit[6300]: USER_END pid=6300 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.685354 systemd-logind[1656]: Session 55 logged out. Waiting for processes to exit. Jan 24 00:51:47.687002 systemd[1]: sshd@53-65.109.167.77:22-4.153.228.146:41204.service: Deactivated successfully. Jan 24 00:51:47.689395 systemd[1]: session-55.scope: Deactivated successfully. 
Jan 24 00:51:47.691682 kernel: audit: type=1106 audit(1769215907.683:1161): pid=6300 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.683000 audit[6300]: CRED_DISP pid=6300 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:47.691919 systemd-logind[1656]: Removed session 55. Jan 24 00:51:47.687000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-65.109.167.77:22-4.153.228.146:41204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:47.698089 kernel: audit: type=1104 audit(1769215907.683:1162): pid=6300 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:48.049479 containerd[1682]: time="2026-01-24T00:51:48.049356671Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:51:48.050672 containerd[1682]: time="2026-01-24T00:51:48.050624891Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 24 00:51:48.050817 containerd[1682]: time="2026-01-24T00:51:48.050732061Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 24 00:51:48.050987 kubelet[2863]: E0124 00:51:48.050891 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:51:48.050987 kubelet[2863]: E0124 00:51:48.050980 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:51:48.051537 kubelet[2863]: E0124 00:51:48.051109 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5fsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 24 00:51:48.052413 kubelet[2863]: E0124 00:51:48.052380 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:51:48.616782 containerd[1682]: time="2026-01-24T00:51:48.616637017Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 24 00:51:49.065568 containerd[1682]: 
time="2026-01-24T00:51:49.065444833Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:51:49.067037 containerd[1682]: time="2026-01-24T00:51:49.066823213Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 24 00:51:49.067037 containerd[1682]: time="2026-01-24T00:51:49.066882063Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 24 00:51:49.067102 kubelet[2863]: E0124 00:51:49.067020 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:51:49.067102 kubelet[2863]: E0124 00:51:49.067071 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:51:49.067321 kubelet[2863]: E0124 00:51:49.067175 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 24 00:51:49.068474 kubelet[2863]: E0124 00:51:49.068447 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:51:50.618519 kubelet[2863]: E0124 00:51:50.618448 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:51:51.618528 containerd[1682]: time="2026-01-24T00:51:51.618434238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:51:52.042692 containerd[1682]: time="2026-01-24T00:51:52.042222815Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:51:52.043942 containerd[1682]: time="2026-01-24T00:51:52.043792244Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:51:52.043942 containerd[1682]: time="2026-01-24T00:51:52.043893004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:51:52.044246 kubelet[2863]: E0124 00:51:52.044140 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:51:52.044246 kubelet[2863]: E0124 00:51:52.044222 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:51:52.044832 kubelet[2863]: E0124 00:51:52.044643 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llmb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:51:52.045851 kubelet[2863]: E0124 00:51:52.045806 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 
00:51:52.814000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-65.109.167.77:22-4.153.228.146:41208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:52.814406 systemd[1]: Started sshd@54-65.109.167.77:22-4.153.228.146:41208.service - OpenSSH per-connection server daemon (4.153.228.146:41208). Jan 24 00:51:52.822321 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:51:52.822405 kernel: audit: type=1130 audit(1769215912.814:1164): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-65.109.167.77:22-4.153.228.146:41208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:53.468000 audit[6315]: USER_ACCT pid=6315 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.469954 sshd[6315]: Accepted publickey for core from 4.153.228.146 port 41208 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:53.472954 sshd-session[6315]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:53.484366 kernel: audit: type=1101 audit(1769215913.468:1165): pid=6315 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.468000 audit[6315]: CRED_ACQ pid=6315 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.492487 systemd-logind[1656]: New session 56 of user core. Jan 24 00:51:53.496723 kernel: audit: type=1103 audit(1769215913.468:1166): pid=6315 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.496805 kernel: audit: type=1006 audit(1769215913.468:1167): pid=6315 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=56 res=1 Jan 24 00:51:53.468000 audit[6315]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff034e0250 a2=3 a3=0 items=0 ppid=1 pid=6315 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:53.503513 kernel: audit: type=1300 audit(1769215913.468:1167): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff034e0250 a2=3 a3=0 items=0 ppid=1 pid=6315 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:53.468000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:53.510297 systemd[1]: Started session-56.scope - Session 56 of User core. 
Jan 24 00:51:53.511198 kernel: audit: type=1327 audit(1769215913.468:1167): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:53.517000 audit[6315]: USER_START pid=6315 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.526000 audit[6319]: CRED_ACQ pid=6319 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.530550 kernel: audit: type=1105 audit(1769215913.517:1168): pid=6315 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.530656 kernel: audit: type=1103 audit(1769215913.526:1169): pid=6319 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.619525 containerd[1682]: time="2026-01-24T00:51:53.619403799Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:51:53.921035 sshd[6319]: Connection closed by 4.153.228.146 port 41208 Jan 24 00:51:53.922503 sshd-session[6315]: pam_unix(sshd:session): session closed for user core Jan 24 00:51:53.943224 kernel: audit: type=1106 audit(1769215913.924:1170): pid=6315 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.924000 audit[6315]: USER_END pid=6315 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.947326 systemd[1]: sshd@54-65.109.167.77:22-4.153.228.146:41208.service: Deactivated successfully. Jan 24 00:51:53.960244 kernel: audit: type=1104 audit(1769215913.924:1171): pid=6315 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.924000 audit[6315]: CRED_DISP pid=6315 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:53.952229 systemd[1]: session-56.scope: Deactivated successfully. Jan 24 00:51:53.959582 systemd-logind[1656]: Session 56 logged out. Waiting for processes to exit. 
Jan 24 00:51:53.948000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-65.109.167.77:22-4.153.228.146:41208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:53.960900 systemd-logind[1656]: Removed session 56. Jan 24 00:51:54.056326 containerd[1682]: time="2026-01-24T00:51:54.056254360Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:51:54.057585 containerd[1682]: time="2026-01-24T00:51:54.057542900Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:51:54.057674 containerd[1682]: time="2026-01-24T00:51:54.057584500Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:51:54.057842 kubelet[2863]: E0124 00:51:54.057804 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:51:54.057842 kubelet[2863]: E0124 00:51:54.057843 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:51:54.058395 kubelet[2863]: E0124 00:51:54.057962 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tm5dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:51:54.059260 kubelet[2863]: E0124 00:51:54.059232 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:51:55.362000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-65.109.167.77:22-92.118.39.87:50974 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:55.362715 systemd[1]: Started sshd@55-65.109.167.77:22-92.118.39.87:50974.service - OpenSSH per-connection server daemon (92.118.39.87:50974). Jan 24 00:51:55.427853 sshd[6331]: Connection closed by 92.118.39.87 port 50974 Jan 24 00:51:55.429143 systemd[1]: sshd@55-65.109.167.77:22-92.118.39.87:50974.service: Deactivated successfully. Jan 24 00:51:55.429000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-65.109.167.77:22-92.118.39.87:50974 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:58.619117 kubelet[2863]: E0124 00:51:58.619035 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:51:59.074952 kernel: kauditd_printk_skb: 3 callbacks suppressed Jan 24 00:51:59.075108 kernel: audit: type=1130 audit(1769215919.061:1175): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-65.109.167.77:22-4.153.228.146:42424 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:51:59.061000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-65.109.167.77:22-4.153.228.146:42424 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:51:59.061823 systemd[1]: Started sshd@56-65.109.167.77:22-4.153.228.146:42424.service - OpenSSH per-connection server daemon (4.153.228.146:42424). Jan 24 00:51:59.618926 kubelet[2863]: E0124 00:51:59.617707 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:51:59.620587 containerd[1682]: time="2026-01-24T00:51:59.620541338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 24 00:51:59.748000 audit[6336]: USER_ACCT pid=6336 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:59.751249 sshd[6336]: Accepted publickey for core from 4.153.228.146 port 42424 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:51:59.756800 sshd-session[6336]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:51:59.764148 kernel: audit: type=1101 audit(1769215919.748:1176): pid=6336 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:59.752000 audit[6336]: CRED_ACQ pid=6336 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:59.775336 systemd-logind[1656]: New session 57 of user core. 
Jan 24 00:51:59.779499 kernel: audit: type=1103 audit(1769215919.752:1177): pid=6336 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:59.779615 kernel: audit: type=1006 audit(1769215919.752:1178): pid=6336 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=57 res=1 Jan 24 00:51:59.752000 audit[6336]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff1852040 a2=3 a3=0 items=0 ppid=1 pid=6336 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:59.752000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:59.802286 kernel: audit: type=1300 audit(1769215919.752:1178): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff1852040 a2=3 a3=0 items=0 ppid=1 pid=6336 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:51:59.802365 kernel: audit: type=1327 audit(1769215919.752:1178): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:51:59.803448 systemd[1]: Started session-57.scope - Session 57 of User core. Jan 24 00:51:59.813000 audit[6336]: USER_START pid=6336 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:59.828135 kernel: audit: type=1105 audit(1769215919.813:1179): pid=6336 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:59.818000 audit[6342]: CRED_ACQ pid=6342 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:51:59.845212 kernel: audit: type=1103 audit(1769215919.818:1180): pid=6342 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:00.057460 containerd[1682]: time="2026-01-24T00:52:00.057317670Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:52:00.058901 containerd[1682]: time="2026-01-24T00:52:00.058849000Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 24 00:52:00.059468 containerd[1682]: time="2026-01-24T00:52:00.058947541Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active 
requests=0, bytes read=0" Jan 24 00:52:00.059508 kubelet[2863]: E0124 00:52:00.059096 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:52:00.059508 kubelet[2863]: E0124 00:52:00.059142 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:52:00.059508 kubelet[2863]: E0124 00:52:00.059262 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 24 00:52:00.062965 containerd[1682]: time="2026-01-24T00:52:00.062710299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 24 00:52:00.244110 sshd[6342]: Connection closed by 4.153.228.146 port 42424 Jan 24 00:52:00.244975 sshd-session[6336]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:00.246000 audit[6336]: USER_END pid=6336 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:00.266461 kernel: audit: type=1106 audit(1769215920.246:1181): pid=6336 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:00.246000 audit[6336]: CRED_DISP pid=6336 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:00.273486 systemd[1]: sshd@56-65.109.167.77:22-4.153.228.146:42424.service: Deactivated successfully. Jan 24 00:52:00.274666 systemd-logind[1656]: Session 57 logged out. Waiting for processes to exit. Jan 24 00:52:00.278670 systemd[1]: session-57.scope: Deactivated successfully. Jan 24 00:52:00.279214 kernel: audit: type=1104 audit(1769215920.246:1182): pid=6336 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:00.274000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-65.109.167.77:22-4.153.228.146:42424 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:00.285932 systemd-logind[1656]: Removed session 57. 
Jan 24 00:52:00.496749 containerd[1682]: time="2026-01-24T00:52:00.496429810Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:52:00.498178 containerd[1682]: time="2026-01-24T00:52:00.498121169Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 24 00:52:00.498303 containerd[1682]: time="2026-01-24T00:52:00.498253849Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 24 00:52:00.498618 kubelet[2863]: E0124 00:52:00.498573 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:52:00.498618 kubelet[2863]: E0124 00:52:00.498618 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:52:00.498769 kubelet[2863]: E0124 00:52:00.498716 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 24 00:52:00.500025 kubelet[2863]: E0124 00:52:00.499998 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:52:03.622699 kubelet[2863]: E0124 00:52:03.622462 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:52:05.373630 systemd[1]: Started sshd@57-65.109.167.77:22-4.153.228.146:41752.service - OpenSSH per-connection server daemon (4.153.228.146:41752). Jan 24 00:52:05.372000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-65.109.167.77:22-4.153.228.146:41752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:05.376178 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:05.376335 kernel: audit: type=1130 audit(1769215925.372:1184): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-65.109.167.77:22-4.153.228.146:41752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:52:05.616578 kubelet[2863]: E0124 00:52:05.616466 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:52:06.043000 audit[6378]: USER_ACCT pid=6378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.049057 sshd-session[6378]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:06.053257 sshd[6378]: Accepted publickey for core from 4.153.228.146 port 41752 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:06.069989 kernel: audit: type=1101 audit(1769215926.043:1185): pid=6378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.070150 kernel: audit: type=1103 audit(1769215926.045:1186): pid=6378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.045000 audit[6378]: CRED_ACQ pid=6378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.064504 systemd-logind[1656]: New session 58 of user core. Jan 24 00:52:06.082369 kernel: audit: type=1006 audit(1769215926.045:1187): pid=6378 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=58 res=1 Jan 24 00:52:06.082956 kernel: audit: type=1300 audit(1769215926.045:1187): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeb52205d0 a2=3 a3=0 items=0 ppid=1 pid=6378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:06.045000 audit[6378]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeb52205d0 a2=3 a3=0 items=0 ppid=1 pid=6378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:06.083569 systemd[1]: Started session-58.scope - Session 58 of User core. 
Jan 24 00:52:06.045000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:06.099298 kernel: audit: type=1327 audit(1769215926.045:1187): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:06.100000 audit[6378]: USER_START pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.107009 kernel: audit: type=1105 audit(1769215926.100:1188): pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.120303 kernel: audit: type=1103 audit(1769215926.105:1189): pid=6382 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.105000 audit[6382]: CRED_ACQ pid=6382 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.505886 sshd[6382]: Connection closed by 4.153.228.146 port 41752 Jan 24 00:52:06.507880 sshd-session[6378]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:06.517964 kernel: audit: type=1106 audit(1769215926.508:1190): pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.508000 audit[6378]: USER_END pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.519322 systemd[1]: sshd@57-65.109.167.77:22-4.153.228.146:41752.service: Deactivated successfully. Jan 24 00:52:06.521805 systemd[1]: session-58.scope: Deactivated successfully. Jan 24 00:52:06.524271 systemd-logind[1656]: Session 58 logged out. Waiting for processes to exit. Jan 24 00:52:06.531352 kernel: audit: type=1104 audit(1769215926.515:1191): pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.515000 audit[6378]: CRED_DISP pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:06.532280 systemd-logind[1656]: Removed session 58. 
Jan 24 00:52:06.518000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-65.109.167.77:22-4.153.228.146:41752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:06.617209 kubelet[2863]: E0124 00:52:06.617154 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:52:10.617218 kubelet[2863]: E0124 00:52:10.617158 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:52:11.643912 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:11.644005 kernel: audit: type=1130 audit(1769215931.635:1193): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-65.109.167.77:22-4.153.228.146:41758 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:11.635000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-65.109.167.77:22-4.153.228.146:41758 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:11.636272 systemd[1]: Started sshd@58-65.109.167.77:22-4.153.228.146:41758.service - OpenSSH per-connection server daemon (4.153.228.146:41758). Jan 24 00:52:12.302000 audit[6394]: USER_ACCT pid=6394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.304414 sshd[6394]: Accepted publickey for core from 4.153.228.146 port 41758 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:12.309817 sshd-session[6394]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:12.323159 kernel: audit: type=1101 audit(1769215932.302:1194): pid=6394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.302000 audit[6394]: CRED_ACQ pid=6394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.325449 systemd-logind[1656]: New session 59 of user core. 
Jan 24 00:52:12.343189 kernel: audit: type=1103 audit(1769215932.302:1195): pid=6394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.343274 kernel: audit: type=1006 audit(1769215932.302:1196): pid=6394 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=59 res=1 Jan 24 00:52:12.302000 audit[6394]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd3b40cb50 a2=3 a3=0 items=0 ppid=1 pid=6394 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:12.358959 kernel: audit: type=1300 audit(1769215932.302:1196): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd3b40cb50 a2=3 a3=0 items=0 ppid=1 pid=6394 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:12.359523 systemd[1]: Started session-59.scope - Session 59 of User core. Jan 24 00:52:12.302000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:12.371233 kernel: audit: type=1327 audit(1769215932.302:1196): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:12.379000 audit[6394]: USER_START pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.382000 audit[6398]: CRED_ACQ pid=6398 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.391487 kernel: audit: type=1105 audit(1769215932.379:1197): pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.391687 kernel: audit: type=1103 audit(1769215932.382:1198): pid=6398 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.782121 sshd[6398]: Connection closed by 4.153.228.146 port 41758 Jan 24 00:52:12.782262 sshd-session[6394]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:12.783000 audit[6394]: USER_END pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.783000 audit[6394]: CRED_DISP pid=6394 uid=0 auid=500 ses=59 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.797645 kernel: audit: type=1106 audit(1769215932.783:1199): pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.797784 kernel: audit: type=1104 audit(1769215932.783:1200): pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:12.797416 systemd[1]: sshd@58-65.109.167.77:22-4.153.228.146:41758.service: Deactivated successfully. Jan 24 00:52:12.799000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-65.109.167.77:22-4.153.228.146:41758 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:12.803590 systemd[1]: session-59.scope: Deactivated successfully. Jan 24 00:52:12.805803 systemd-logind[1656]: Session 59 logged out. Waiting for processes to exit. Jan 24 00:52:12.806716 systemd-logind[1656]: Removed session 59. Jan 24 00:52:13.618757 kubelet[2863]: E0124 00:52:13.618593 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:52:14.617173 kubelet[2863]: E0124 00:52:14.617120 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:52:14.618436 kubelet[2863]: E0124 00:52:14.618381 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not 
found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:52:17.927673 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:17.927787 kernel: audit: type=1130 audit(1769215937.919:1202): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-65.109.167.77:22-4.153.228.146:36376 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:17.919000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-65.109.167.77:22-4.153.228.146:36376 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:17.920273 systemd[1]: Started sshd@59-65.109.167.77:22-4.153.228.146:36376.service - OpenSSH per-connection server daemon (4.153.228.146:36376). Jan 24 00:52:18.585000 audit[6410]: USER_ACCT pid=6410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:18.595448 sshd-session[6410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:18.598003 sshd[6410]: Accepted publickey for core from 4.153.228.146 port 36376 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:18.589000 audit[6410]: CRED_ACQ pid=6410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:18.606912 kernel: audit: type=1101 audit(1769215938.585:1203): pid=6410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:18.607472 kernel: audit: type=1103 audit(1769215938.589:1204): pid=6410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:18.615840 systemd-logind[1656]: New session 60 of user core. 
Jan 24 00:52:18.622807 kernel: audit: type=1006 audit(1769215938.589:1205): pid=6410 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=60 res=1 Jan 24 00:52:18.589000 audit[6410]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffecbae1920 a2=3 a3=0 items=0 ppid=1 pid=6410 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:18.633571 kernel: audit: type=1300 audit(1769215938.589:1205): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffecbae1920 a2=3 a3=0 items=0 ppid=1 pid=6410 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:18.634378 systemd[1]: Started session-60.scope - Session 60 of User core. Jan 24 00:52:18.589000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:18.646237 kernel: audit: type=1327 audit(1769215938.589:1205): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:18.642000 audit[6410]: USER_START pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:18.654298 kernel: audit: type=1105 audit(1769215938.642:1206): pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:18.659000 audit[6414]: CRED_ACQ pid=6414 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:18.686178 kernel: audit: type=1103 audit(1769215938.659:1207): pid=6414 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:19.110348 sshd[6414]: Connection closed by 4.153.228.146 port 36376 Jan 24 00:52:19.111428 sshd-session[6410]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:19.113000 audit[6410]: USER_END pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:19.120613 systemd[1]: sshd@59-65.109.167.77:22-4.153.228.146:36376.service: Deactivated successfully. Jan 24 00:52:19.121807 systemd-logind[1656]: Session 60 logged out. Waiting for processes to exit. Jan 24 00:52:19.128642 systemd[1]: session-60.scope: Deactivated successfully. 
Jan 24 00:52:19.133106 kernel: audit: type=1106 audit(1769215939.113:1208): pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:19.113000 audit[6410]: CRED_DISP pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:19.141156 systemd-logind[1656]: Removed session 60. Jan 24 00:52:19.148156 kernel: audit: type=1104 audit(1769215939.113:1209): pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:19.120000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-65.109.167.77:22-4.153.228.146:36376 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:20.616783 kubelet[2863]: E0124 00:52:20.616267 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:52:20.621567 kubelet[2863]: E0124 00:52:20.621323 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:52:22.616961 kubelet[2863]: E0124 00:52:22.616598 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:52:24.252852 systemd[1]: Started sshd@60-65.109.167.77:22-4.153.228.146:36390.service - OpenSSH per-connection server daemon (4.153.228.146:36390). Jan 24 00:52:24.251000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-65.109.167.77:22-4.153.228.146:36390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:52:24.255178 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:24.255250 kernel: audit: type=1130 audit(1769215944.251:1211): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-65.109.167.77:22-4.153.228.146:36390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:24.929000 audit[6428]: USER_ACCT pid=6428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:24.933128 sshd-session[6428]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:24.935391 sshd[6428]: Accepted publickey for core from 4.153.228.146 port 36390 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:24.945226 systemd-logind[1656]: New session 61 of user core. Jan 24 00:52:24.946634 kernel: audit: type=1101 audit(1769215944.929:1212): pid=6428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:24.946688 kernel: audit: type=1103 audit(1769215944.929:1213): pid=6428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:24.929000 audit[6428]: CRED_ACQ pid=6428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:24.961240 systemd[1]: Started session-61.scope - Session 61 of User core. 
Jan 24 00:52:24.963905 kernel: audit: type=1006 audit(1769215944.929:1214): pid=6428 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=61 res=1 Jan 24 00:52:24.929000 audit[6428]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc0847c680 a2=3 a3=0 items=0 ppid=1 pid=6428 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:24.975397 kernel: audit: type=1300 audit(1769215944.929:1214): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc0847c680 a2=3 a3=0 items=0 ppid=1 pid=6428 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:24.929000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:24.988693 kernel: audit: type=1327 audit(1769215944.929:1214): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:24.962000 audit[6428]: USER_START pid=6428 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:24.995829 kernel: audit: type=1105 audit(1769215944.962:1215): pid=6428 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:24.969000 audit[6432]: CRED_ACQ pid=6432 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:25.009425 kernel: audit: type=1103 audit(1769215944.969:1216): pid=6432 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:25.351823 sshd[6432]: Connection closed by 4.153.228.146 port 36390 Jan 24 00:52:25.353221 sshd-session[6428]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:25.353000 audit[6428]: USER_END pid=6428 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:25.357715 systemd[1]: sshd@60-65.109.167.77:22-4.153.228.146:36390.service: Deactivated successfully. Jan 24 00:52:25.360442 systemd[1]: session-61.scope: Deactivated successfully. Jan 24 00:52:25.361748 systemd-logind[1656]: Session 61 logged out. Waiting for processes to exit. 
Jan 24 00:52:25.353000 audit[6428]: CRED_DISP pid=6428 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:25.367499 kernel: audit: type=1106 audit(1769215945.353:1217): pid=6428 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:25.367537 kernel: audit: type=1104 audit(1769215945.353:1218): pid=6428 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:25.365071 systemd-logind[1656]: Removed session 61. Jan 24 00:52:25.355000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-65.109.167.77:22-4.153.228.146:36390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:28.617236 kubelet[2863]: E0124 00:52:28.617172 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:52:28.619913 kubelet[2863]: E0124 00:52:28.618585 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:52:29.617261 kubelet[2863]: E0124 00:52:29.617082 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: 
code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:52:30.500239 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:30.500563 kernel: audit: type=1130 audit(1769215950.489:1220): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-65.109.167.77:22-4.153.228.146:42876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:30.489000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-65.109.167.77:22-4.153.228.146:42876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:30.489133 systemd[1]: Started sshd@61-65.109.167.77:22-4.153.228.146:42876.service - OpenSSH per-connection server daemon (4.153.228.146:42876). Jan 24 00:52:31.175000 audit[6460]: USER_ACCT pid=6460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.178765 sshd[6460]: Accepted publickey for core from 4.153.228.146 port 42876 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:31.180861 sshd-session[6460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:31.177000 audit[6460]: CRED_ACQ pid=6460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.185075 kernel: audit: type=1101 audit(1769215951.175:1221): pid=6460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.185129 kernel: audit: type=1103 audit(1769215951.177:1222): pid=6460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.189330 kernel: audit: type=1006 audit(1769215951.177:1223): pid=6460 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=62 res=1 Jan 24 00:52:31.196944 systemd-logind[1656]: New session 62 of user core. Jan 24 00:52:31.177000 audit[6460]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc8f6afae0 a2=3 a3=0 items=0 ppid=1 pid=6460 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:31.204431 systemd[1]: Started session-62.scope - Session 62 of User core. 
Jan 24 00:52:31.205084 kernel: audit: type=1300 audit(1769215951.177:1223): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc8f6afae0 a2=3 a3=0 items=0 ppid=1 pid=6460 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:31.177000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:31.209280 kernel: audit: type=1327 audit(1769215951.177:1223): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:31.209000 audit[6460]: USER_START pid=6460 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.218084 kernel: audit: type=1105 audit(1769215951.209:1224): pid=6460 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.218138 kernel: audit: type=1103 audit(1769215951.214:1225): pid=6465 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.214000 audit[6465]: CRED_ACQ pid=6465 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.608163 sshd[6465]: Connection closed by 4.153.228.146 port 42876 Jan 24 00:52:31.609325 sshd-session[6460]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:31.610000 audit[6460]: USER_END pid=6460 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.617298 systemd-logind[1656]: Session 62 logged out. Waiting for processes to exit. Jan 24 00:52:31.618950 systemd[1]: sshd@61-65.109.167.77:22-4.153.228.146:42876.service: Deactivated successfully. Jan 24 00:52:31.631918 kernel: audit: type=1106 audit(1769215951.610:1226): pid=6460 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.629648 systemd[1]: session-62.scope: Deactivated successfully. 
Jan 24 00:52:31.610000 audit[6460]: CRED_DISP pid=6460 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.639571 systemd-logind[1656]: Removed session 62. Jan 24 00:52:31.646111 kernel: audit: type=1104 audit(1769215951.610:1227): pid=6460 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:31.615000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-65.109.167.77:22-4.153.228.146:42876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:33.619684 kubelet[2863]: E0124 00:52:33.619607 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:52:33.620415 kubelet[2863]: E0124 00:52:33.619968 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:52:36.616960 kubelet[2863]: E0124 00:52:36.616676 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:52:36.746159 systemd[1]: Started sshd@62-65.109.167.77:22-4.153.228.146:44856.service - OpenSSH per-connection server daemon (4.153.228.146:44856). Jan 24 00:52:36.748377 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:36.748461 kernel: audit: type=1130 audit(1769215956.745:1229): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-65.109.167.77:22-4.153.228.146:44856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:36.745000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-65.109.167.77:22-4.153.228.146:44856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:52:37.442000 audit[6501]: USER_ACCT pid=6501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.450076 kernel: audit: type=1101 audit(1769215957.442:1230): pid=6501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.450892 sshd[6501]: Accepted publickey for core from 4.153.228.146 port 44856 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:37.452013 sshd-session[6501]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:37.449000 audit[6501]: CRED_ACQ pid=6501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.461699 kernel: audit: type=1103 audit(1769215957.449:1231): pid=6501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.461743 kernel: audit: type=1006 audit(1769215957.450:1232): pid=6501 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=63 res=1 Jan 24 00:52:37.463942 systemd-logind[1656]: New session 63 of user core. Jan 24 00:52:37.450000 audit[6501]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffde8fb1350 a2=3 a3=0 items=0 ppid=1 pid=6501 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:37.467373 kernel: audit: type=1300 audit(1769215957.450:1232): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffde8fb1350 a2=3 a3=0 items=0 ppid=1 pid=6501 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:37.450000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:37.476086 kernel: audit: type=1327 audit(1769215957.450:1232): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:37.478409 systemd[1]: Started session-63.scope - Session 63 of User core. 
Jan 24 00:52:37.482000 audit[6501]: USER_START pid=6501 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.493085 kernel: audit: type=1105 audit(1769215957.482:1233): pid=6501 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.482000 audit[6512]: CRED_ACQ pid=6512 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.503089 kernel: audit: type=1103 audit(1769215957.482:1234): pid=6512 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.926278 sshd[6512]: Connection closed by 4.153.228.146 port 44856 Jan 24 00:52:37.928285 sshd-session[6501]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:37.930000 audit[6501]: USER_END pid=6501 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.948170 kernel: audit: type=1106 audit(1769215957.930:1235): pid=6501 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.951263 kernel: audit: type=1104 audit(1769215957.930:1236): pid=6501 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.930000 audit[6501]: CRED_DISP pid=6501 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:37.950306 systemd[1]: sshd@62-65.109.167.77:22-4.153.228.146:44856.service: Deactivated successfully. Jan 24 00:52:37.952000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-65.109.167.77:22-4.153.228.146:44856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:37.958698 systemd[1]: session-63.scope: Deactivated successfully. Jan 24 00:52:37.959919 systemd-logind[1656]: Session 63 logged out. Waiting for processes to exit. Jan 24 00:52:37.961167 systemd-logind[1656]: Removed session 63. 
Jan 24 00:52:40.617303 kubelet[2863]: E0124 00:52:40.617185 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:52:41.620755 kubelet[2863]: E0124 00:52:41.620680 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:52:41.623316 kubelet[2863]: E0124 00:52:41.623264 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:52:43.063673 systemd[1]: Started sshd@63-65.109.167.77:22-4.153.228.146:44866.service - OpenSSH per-connection server daemon (4.153.228.146:44866). Jan 24 00:52:43.066297 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:43.066349 kernel: audit: type=1130 audit(1769215963.062:1238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-65.109.167.77:22-4.153.228.146:44866 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:43.062000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-65.109.167.77:22-4.153.228.146:44866 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:52:43.748000 audit[6525]: USER_ACCT pid=6525 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:43.755996 sshd[6525]: Accepted publickey for core from 4.153.228.146 port 44866 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:43.756294 kernel: audit: type=1101 audit(1769215963.748:1239): pid=6525 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:43.756494 sshd-session[6525]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:43.750000 audit[6525]: CRED_ACQ pid=6525 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:43.769930 kernel: audit: type=1103 audit(1769215963.750:1240): pid=6525 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:43.769988 kernel: audit: type=1006 audit(1769215963.750:1241): pid=6525 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=64 res=1 Jan 24 00:52:43.770042 systemd-logind[1656]: New session 64 of user core. Jan 24 00:52:43.750000 audit[6525]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed4eceb80 a2=3 a3=0 items=0 ppid=1 pid=6525 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:43.781120 kernel: audit: type=1300 audit(1769215963.750:1241): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed4eceb80 a2=3 a3=0 items=0 ppid=1 pid=6525 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:43.781225 kernel: audit: type=1327 audit(1769215963.750:1241): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:43.750000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:43.782892 systemd[1]: Started session-64.scope - Session 64 of User core. 
Jan 24 00:52:43.787000 audit[6525]: USER_START pid=6525 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:43.798355 kernel: audit: type=1105 audit(1769215963.787:1242): pid=6525 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:43.798000 audit[6529]: CRED_ACQ pid=6529 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:43.806142 kernel: audit: type=1103 audit(1769215963.798:1243): pid=6529 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:44.247595 sshd[6529]: Connection closed by 4.153.228.146 port 44866 Jan 24 00:52:44.249399 sshd-session[6525]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:44.250000 audit[6525]: USER_END pid=6525 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:44.250000 audit[6525]: CRED_DISP pid=6525 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:44.262156 kernel: audit: type=1106 audit(1769215964.250:1244): pid=6525 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:44.262207 kernel: audit: type=1104 audit(1769215964.250:1245): pid=6525 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:44.266176 systemd[1]: sshd@63-65.109.167.77:22-4.153.228.146:44866.service: Deactivated successfully. Jan 24 00:52:44.265000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-65.109.167.77:22-4.153.228.146:44866 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:44.268713 systemd[1]: session-64.scope: Deactivated successfully. Jan 24 00:52:44.270754 systemd-logind[1656]: Session 64 logged out. Waiting for processes to exit. Jan 24 00:52:44.273415 systemd-logind[1656]: Removed session 64. 
Jan 24 00:52:47.615826 kubelet[2863]: E0124 00:52:47.615551 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:52:48.615798 kubelet[2863]: E0124 00:52:48.615726 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:52:49.397100 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:49.397240 kernel: audit: type=1130 audit(1769215969.385:1247): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-65.109.167.77:22-4.153.228.146:36494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:49.385000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-65.109.167.77:22-4.153.228.146:36494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:49.386823 systemd[1]: Started sshd@64-65.109.167.77:22-4.153.228.146:36494.service - OpenSSH per-connection server daemon (4.153.228.146:36494). Jan 24 00:52:50.074000 audit[6542]: USER_ACCT pid=6542 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.076987 sshd[6542]: Accepted publickey for core from 4.153.228.146 port 36494 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:50.078401 sshd-session[6542]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:50.084099 systemd-logind[1656]: New session 65 of user core. 
Jan 24 00:52:50.076000 audit[6542]: CRED_ACQ pid=6542 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.089990 kernel: audit: type=1101 audit(1769215970.074:1248): pid=6542 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.090039 kernel: audit: type=1103 audit(1769215970.076:1249): pid=6542 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.090293 systemd[1]: Started session-65.scope - Session 65 of User core. Jan 24 00:52:50.104123 kernel: audit: type=1006 audit(1769215970.076:1250): pid=6542 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=65 res=1 Jan 24 00:52:50.076000 audit[6542]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc481a0ff0 a2=3 a3=0 items=0 ppid=1 pid=6542 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:50.117094 kernel: audit: type=1300 audit(1769215970.076:1250): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc481a0ff0 a2=3 a3=0 items=0 ppid=1 pid=6542 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:50.076000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:50.123086 kernel: audit: type=1327 audit(1769215970.076:1250): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:50.096000 audit[6542]: USER_START pid=6542 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.098000 audit[6546]: CRED_ACQ pid=6546 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.140674 kernel: audit: type=1105 audit(1769215970.096:1251): pid=6542 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.140733 kernel: audit: type=1103 audit(1769215970.098:1252): pid=6546 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 
00:52:50.553697 sshd[6546]: Connection closed by 4.153.228.146 port 36494 Jan 24 00:52:50.554269 sshd-session[6542]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:50.554000 audit[6542]: USER_END pid=6542 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.561436 systemd[1]: sshd@64-65.109.167.77:22-4.153.228.146:36494.service: Deactivated successfully. Jan 24 00:52:50.563101 kernel: audit: type=1106 audit(1769215970.554:1253): pid=6542 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.554000 audit[6542]: CRED_DISP pid=6542 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.566842 systemd[1]: session-65.scope: Deactivated successfully. Jan 24 00:52:50.556000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-65.109.167.77:22-4.153.228.146:36494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:50.569602 systemd-logind[1656]: Session 65 logged out. Waiting for processes to exit. Jan 24 00:52:50.570107 kernel: audit: type=1104 audit(1769215970.554:1254): pid=6542 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:50.573481 systemd-logind[1656]: Removed session 65. 
Jan 24 00:52:50.616906 kubelet[2863]: E0124 00:52:50.616824 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:52:51.617743 kubelet[2863]: E0124 00:52:51.617688 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:52:52.618237 kubelet[2863]: E0124 00:52:52.618044 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:52:53.618605 kubelet[2863]: E0124 00:52:53.618364 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:52:55.685493 systemd[1]: Started sshd@65-65.109.167.77:22-4.153.228.146:37006.service - OpenSSH per-connection server daemon (4.153.228.146:37006). 
Jan 24 00:52:55.690962 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:52:55.691034 kernel: audit: type=1130 audit(1769215975.685:1256): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-65.109.167.77:22-4.153.228.146:37006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:55.685000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-65.109.167.77:22-4.153.228.146:37006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:52:56.369000 audit[6558]: USER_ACCT pid=6558 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.377144 kernel: audit: type=1101 audit(1769215976.369:1257): pid=6558 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.377746 sshd[6558]: Accepted publickey for core from 4.153.228.146 port 37006 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:52:56.379289 sshd-session[6558]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:52:56.377000 audit[6558]: CRED_ACQ pid=6558 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.385229 kernel: audit: type=1103 audit(1769215976.377:1258): pid=6558 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.390254 systemd-logind[1656]: New session 66 of user core. Jan 24 00:52:56.400396 kernel: audit: type=1006 audit(1769215976.377:1259): pid=6558 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=66 res=1 Jan 24 00:52:56.400454 kernel: audit: type=1300 audit(1769215976.377:1259): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeafcca680 a2=3 a3=0 items=0 ppid=1 pid=6558 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:56.377000 audit[6558]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeafcca680 a2=3 a3=0 items=0 ppid=1 pid=6558 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:52:56.401137 kernel: audit: type=1327 audit(1769215976.377:1259): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:56.377000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:52:56.400714 systemd[1]: Started session-66.scope - Session 66 of User core. 
Jan 24 00:52:56.404000 audit[6558]: USER_START pid=6558 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.412163 kernel: audit: type=1105 audit(1769215976.404:1260): pid=6558 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.414000 audit[6562]: CRED_ACQ pid=6562 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.421087 kernel: audit: type=1103 audit(1769215976.414:1261): pid=6562 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.873208 sshd[6562]: Connection closed by 4.153.228.146 port 37006 Jan 24 00:52:56.875387 sshd-session[6558]: pam_unix(sshd:session): session closed for user core Jan 24 00:52:56.878000 audit[6558]: USER_END pid=6558 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.896172 kernel: audit: type=1106 audit(1769215976.878:1262): pid=6558 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.898213 systemd[1]: sshd@65-65.109.167.77:22-4.153.228.146:37006.service: Deactivated successfully. Jan 24 00:52:56.902039 systemd[1]: session-66.scope: Deactivated successfully. Jan 24 00:52:56.878000 audit[6558]: CRED_DISP pid=6558 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:52:56.904910 systemd-logind[1656]: Session 66 logged out. Waiting for processes to exit. Jan 24 00:52:56.911773 systemd-logind[1656]: Removed session 66. Jan 24 00:52:56.897000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-65.109.167.77:22-4.153.228.146:37006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:52:56.916157 kernel: audit: type=1104 audit(1769215976.878:1263): pid=6558 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:01.615996 kubelet[2863]: E0124 00:53:01.615766 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:53:02.011000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-65.109.167.77:22-4.153.228.146:37010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:02.014862 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:02.015011 kernel: audit: type=1130 audit(1769215982.011:1265): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-65.109.167.77:22-4.153.228.146:37010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:02.012332 systemd[1]: Started sshd@66-65.109.167.77:22-4.153.228.146:37010.service - OpenSSH per-connection server daemon (4.153.228.146:37010). Jan 24 00:53:02.616696 kubelet[2863]: E0124 00:53:02.616639 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:53:02.618414 kubelet[2863]: E0124 00:53:02.618341 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:53:02.721722 sshd[6576]: Accepted publickey for core from 4.153.228.146 port 37010 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:02.720000 audit[6576]: USER_ACCT pid=6576 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting 
grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:02.736091 kernel: audit: type=1101 audit(1769215982.720:1266): pid=6576 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:02.742767 sshd-session[6576]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:02.739000 audit[6576]: CRED_ACQ pid=6576 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:02.757101 kernel: audit: type=1103 audit(1769215982.739:1267): pid=6576 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:02.761150 systemd-logind[1656]: New session 67 of user core. Jan 24 00:53:02.764076 kernel: audit: type=1006 audit(1769215982.740:1268): pid=6576 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=67 res=1 Jan 24 00:53:02.764123 kernel: audit: type=1300 audit(1769215982.740:1268): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd112750d0 a2=3 a3=0 items=0 ppid=1 pid=6576 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:02.740000 audit[6576]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd112750d0 a2=3 a3=0 items=0 ppid=1 pid=6576 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:02.740000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:02.772088 kernel: audit: type=1327 audit(1769215982.740:1268): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:02.773322 systemd[1]: Started session-67.scope - Session 67 of User core. 
Jan 24 00:53:02.776000 audit[6576]: USER_START pid=6576 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:02.785114 kernel: audit: type=1105 audit(1769215982.776:1269): pid=6576 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:02.781000 audit[6580]: CRED_ACQ pid=6580 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:02.797081 kernel: audit: type=1103 audit(1769215982.781:1270): pid=6580 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:03.150631 sshd[6580]: Connection closed by 4.153.228.146 port 37010 Jan 24 00:53:03.152214 sshd-session[6576]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:03.152000 audit[6576]: USER_END pid=6576 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:03.156356 systemd[1]: sshd@66-65.109.167.77:22-4.153.228.146:37010.service: Deactivated successfully. Jan 24 00:53:03.158538 systemd[1]: session-67.scope: Deactivated successfully. Jan 24 00:53:03.152000 audit[6576]: CRED_DISP pid=6576 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:03.160891 systemd-logind[1656]: Session 67 logged out. Waiting for processes to exit. Jan 24 00:53:03.161489 kernel: audit: type=1106 audit(1769215983.152:1271): pid=6576 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:03.161539 kernel: audit: type=1104 audit(1769215983.152:1272): pid=6576 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:03.161871 systemd-logind[1656]: Removed session 67. Jan 24 00:53:03.154000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-65.109.167.77:22-4.153.228.146:37010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:53:03.620193 kubelet[2863]: E0124 00:53:03.620145 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:53:03.622082 kubelet[2863]: E0124 00:53:03.622024 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:53:07.618349 kubelet[2863]: E0124 00:53:07.618295 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:53:08.287554 systemd[1]: Started sshd@67-65.109.167.77:22-4.153.228.146:40880.service - OpenSSH per-connection server daemon (4.153.228.146:40880). Jan 24 00:53:08.290055 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:08.291001 kernel: audit: type=1130 audit(1769215988.286:1274): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-65.109.167.77:22-4.153.228.146:40880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:08.286000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-65.109.167.77:22-4.153.228.146:40880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:53:08.982000 audit[6616]: USER_ACCT pid=6616 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.000366 kernel: audit: type=1101 audit(1769215988.982:1275): pid=6616 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.000476 sshd[6616]: Accepted publickey for core from 4.153.228.146 port 40880 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:09.005197 sshd-session[6616]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:09.001000 audit[6616]: CRED_ACQ pid=6616 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.019106 kernel: audit: type=1103 audit(1769215989.001:1276): pid=6616 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.028251 systemd-logind[1656]: New session 68 of user core. Jan 24 00:53:09.044140 kernel: audit: type=1006 audit(1769215989.001:1277): pid=6616 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=68 res=1 Jan 24 00:53:09.044224 kernel: audit: type=1300 audit(1769215989.001:1277): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff338c6330 a2=3 a3=0 items=0 ppid=1 pid=6616 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:09.001000 audit[6616]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff338c6330 a2=3 a3=0 items=0 ppid=1 pid=6616 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:09.044520 systemd[1]: Started session-68.scope - Session 68 of User core. 
Jan 24 00:53:09.001000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:09.061580 kernel: audit: type=1327 audit(1769215989.001:1277): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:09.061667 kernel: audit: type=1105 audit(1769215989.047:1278): pid=6616 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.047000 audit[6616]: USER_START pid=6616 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.065105 kernel: audit: type=1103 audit(1769215989.051:1279): pid=6620 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.051000 audit[6620]: CRED_ACQ pid=6620 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.453924 sshd[6620]: Connection closed by 4.153.228.146 port 40880 Jan 24 00:53:09.455289 sshd-session[6616]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:09.465578 kernel: audit: type=1106 audit(1769215989.456:1280): pid=6616 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.456000 audit[6616]: USER_END pid=6616 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.460468 systemd[1]: sshd@67-65.109.167.77:22-4.153.228.146:40880.service: Deactivated successfully. Jan 24 00:53:09.462919 systemd[1]: session-68.scope: Deactivated successfully. Jan 24 00:53:09.456000 audit[6616]: CRED_DISP pid=6616 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:09.467439 systemd-logind[1656]: Session 68 logged out. Waiting for processes to exit. Jan 24 00:53:09.468356 systemd-logind[1656]: Removed session 68. Jan 24 00:53:09.458000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-65.109.167.77:22-4.153.228.146:40880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:53:09.472779 kernel: audit: type=1104 audit(1769215989.456:1281): pid=6616 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:14.590308 systemd[1]: Started sshd@68-65.109.167.77:22-4.153.228.146:50994.service - OpenSSH per-connection server daemon (4.153.228.146:50994). Jan 24 00:53:14.590000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-65.109.167.77:22-4.153.228.146:50994 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:14.595400 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:14.595513 kernel: audit: type=1130 audit(1769215994.590:1283): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-65.109.167.77:22-4.153.228.146:50994 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:14.619758 kubelet[2863]: E0124 00:53:14.619636 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:53:15.293000 audit[6632]: USER_ACCT pid=6632 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.305044 sshd[6632]: Accepted publickey for core from 4.153.228.146 port 50994 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:15.302000 audit[6632]: CRED_ACQ pid=6632 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.309298 kernel: audit: type=1101 audit(1769215995.293:1284): pid=6632 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.309435 kernel: audit: type=1103 audit(1769215995.302:1285): pid=6632 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.309971 sshd-session[6632]: pam_unix(sshd:session): 
session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:15.319849 kernel: audit: type=1006 audit(1769215995.302:1286): pid=6632 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=69 res=1 Jan 24 00:53:15.325553 kernel: audit: type=1300 audit(1769215995.302:1286): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffec4aa2080 a2=3 a3=0 items=0 ppid=1 pid=6632 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:15.302000 audit[6632]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffec4aa2080 a2=3 a3=0 items=0 ppid=1 pid=6632 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:15.302000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:15.341123 kernel: audit: type=1327 audit(1769215995.302:1286): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:15.348515 systemd-logind[1656]: New session 69 of user core. Jan 24 00:53:15.352207 systemd[1]: Started session-69.scope - Session 69 of User core. Jan 24 00:53:15.360000 audit[6632]: USER_START pid=6632 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.369122 kernel: audit: type=1105 audit(1769215995.360:1287): pid=6632 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.368000 audit[6636]: CRED_ACQ pid=6636 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.376162 kernel: audit: type=1103 audit(1769215995.368:1288): pid=6636 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.617530 kubelet[2863]: E0124 00:53:15.617411 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:53:15.618056 kubelet[2863]: E0124 00:53:15.617855 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:53:15.725181 sshd[6636]: Connection closed by 4.153.228.146 port 50994 Jan 24 00:53:15.727394 sshd-session[6632]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:15.736114 kernel: audit: type=1106 audit(1769215995.727:1289): pid=6632 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.727000 audit[6632]: USER_END pid=6632 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.733759 systemd[1]: sshd@68-65.109.167.77:22-4.153.228.146:50994.service: Deactivated successfully. Jan 24 00:53:15.728000 audit[6632]: CRED_DISP pid=6632 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.737548 systemd[1]: session-69.scope: Deactivated successfully. Jan 24 00:53:15.743600 systemd-logind[1656]: Session 69 logged out. Waiting for processes to exit. Jan 24 00:53:15.730000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-65.109.167.77:22-4.153.228.146:50994 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:15.744081 kernel: audit: type=1104 audit(1769215995.728:1290): pid=6632 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:15.744568 systemd-logind[1656]: Removed session 69. 
Jan 24 00:53:16.616473 kubelet[2863]: E0124 00:53:16.616402 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:53:17.617184 kubelet[2863]: E0124 00:53:17.617048 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:53:18.617379 kubelet[2863]: E0124 00:53:18.617308 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:53:20.869357 systemd[1]: Started sshd@69-65.109.167.77:22-4.153.228.146:51000.service - OpenSSH per-connection server daemon (4.153.228.146:51000). Jan 24 00:53:20.878137 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:20.878192 kernel: audit: type=1130 audit(1769216000.869:1292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-65.109.167.77:22-4.153.228.146:51000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:20.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-65.109.167.77:22-4.153.228.146:51000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:53:21.557000 audit[6648]: USER_ACCT pid=6648 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:21.563458 sshd-session[6648]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:21.564984 sshd[6648]: Accepted publickey for core from 4.153.228.146 port 51000 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:21.565130 kernel: audit: type=1101 audit(1769216001.557:1293): pid=6648 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:21.559000 audit[6648]: CRED_ACQ pid=6648 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:21.572173 kernel: audit: type=1103 audit(1769216001.559:1294): pid=6648 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:21.572328 kernel: audit: type=1006 audit(1769216001.559:1295): pid=6648 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=70 res=1 Jan 24 00:53:21.559000 audit[6648]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc2e0a3a70 a2=3 a3=0 items=0 ppid=1 pid=6648 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:21.577805 kernel: audit: type=1300 audit(1769216001.559:1295): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc2e0a3a70 a2=3 a3=0 items=0 ppid=1 pid=6648 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:21.559000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:21.584827 kernel: audit: type=1327 audit(1769216001.559:1295): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:21.584418 systemd-logind[1656]: New session 70 of user core. Jan 24 00:53:21.591349 systemd[1]: Started session-70.scope - Session 70 of User core. 
Jan 24 00:53:21.598000 audit[6648]: USER_START pid=6648 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:21.601000 audit[6652]: CRED_ACQ pid=6652 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:21.609279 kernel: audit: type=1105 audit(1769216001.598:1296): pid=6648 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:21.609336 kernel: audit: type=1103 audit(1769216001.601:1297): pid=6652 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:22.065652 sshd[6652]: Connection closed by 4.153.228.146 port 51000 Jan 24 00:53:22.065973 sshd-session[6648]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:22.066000 audit[6648]: USER_END pid=6648 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:22.070749 systemd[1]: sshd@69-65.109.167.77:22-4.153.228.146:51000.service: Deactivated successfully. Jan 24 00:53:22.071158 systemd-logind[1656]: Session 70 logged out. Waiting for processes to exit. Jan 24 00:53:22.073691 systemd[1]: session-70.scope: Deactivated successfully. Jan 24 00:53:22.076202 kernel: audit: type=1106 audit(1769216002.066:1298): pid=6648 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:22.077039 systemd-logind[1656]: Removed session 70. Jan 24 00:53:22.066000 audit[6648]: CRED_DISP pid=6648 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:22.069000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-65.109.167.77:22-4.153.228.146:51000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:53:22.084109 kernel: audit: type=1104 audit(1769216002.066:1299): pid=6648 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:26.616667 kubelet[2863]: E0124 00:53:26.616296 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:53:27.205015 systemd[1]: Started sshd@70-65.109.167.77:22-4.153.228.146:56502.service - OpenSSH per-connection server daemon (4.153.228.146:56502). Jan 24 00:53:27.213997 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:27.214155 kernel: audit: type=1130 audit(1769216007.204:1301): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-65.109.167.77:22-4.153.228.146:56502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:27.204000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-65.109.167.77:22-4.153.228.146:56502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:27.616946 kubelet[2863]: E0124 00:53:27.616661 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:53:27.905000 audit[6666]: USER_ACCT pid=6666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:27.911003 sshd-session[6666]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:27.915430 kernel: audit: type=1101 audit(1769216007.905:1302): pid=6666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:27.915473 sshd[6666]: Accepted publickey for core from 4.153.228.146 port 56502 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:27.908000 audit[6666]: CRED_ACQ pid=6666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:27.923866 
systemd-logind[1656]: New session 71 of user core. Jan 24 00:53:27.929093 kernel: audit: type=1103 audit(1769216007.908:1303): pid=6666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:27.931630 systemd[1]: Started session-71.scope - Session 71 of User core. Jan 24 00:53:27.936395 kernel: audit: type=1006 audit(1769216007.908:1304): pid=6666 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=71 res=1 Jan 24 00:53:27.908000 audit[6666]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd0569fc50 a2=3 a3=0 items=0 ppid=1 pid=6666 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:27.946245 kernel: audit: type=1300 audit(1769216007.908:1304): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd0569fc50 a2=3 a3=0 items=0 ppid=1 pid=6666 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:27.908000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:27.960046 kernel: audit: type=1327 audit(1769216007.908:1304): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:27.960127 kernel: audit: type=1105 audit(1769216007.936:1305): pid=6666 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:27.936000 audit[6666]: USER_START pid=6666 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:27.937000 audit[6670]: CRED_ACQ pid=6670 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:27.970083 kernel: audit: type=1103 audit(1769216007.937:1306): pid=6670 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:28.332778 sshd[6670]: Connection closed by 4.153.228.146 port 56502 Jan 24 00:53:28.335998 sshd-session[6666]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:28.336000 audit[6666]: USER_END pid=6666 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:28.340576 systemd-logind[1656]: 
Session 71 logged out. Waiting for processes to exit. Jan 24 00:53:28.341746 systemd[1]: sshd@70-65.109.167.77:22-4.153.228.146:56502.service: Deactivated successfully. Jan 24 00:53:28.345253 systemd[1]: session-71.scope: Deactivated successfully. Jan 24 00:53:28.346104 kernel: audit: type=1106 audit(1769216008.336:1307): pid=6666 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:28.336000 audit[6666]: CRED_DISP pid=6666 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:28.349756 systemd-logind[1656]: Removed session 71. Jan 24 00:53:28.355079 kernel: audit: type=1104 audit(1769216008.336:1308): pid=6666 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:28.342000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-65.109.167.77:22-4.153.228.146:56502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:28.617127 kubelet[2863]: E0124 00:53:28.616971 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:53:30.618188 kubelet[2863]: E0124 00:53:30.618008 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:53:31.617566 kubelet[2863]: E0124 00:53:31.617341 2863 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:53:33.476000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-65.109.167.77:22-4.153.228.146:56506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:33.477294 systemd[1]: Started sshd@71-65.109.167.77:22-4.153.228.146:56506.service - OpenSSH per-connection server daemon (4.153.228.146:56506). Jan 24 00:53:33.481097 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:33.481188 kernel: audit: type=1130 audit(1769216013.476:1310): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-65.109.167.77:22-4.153.228.146:56506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:33.617629 kubelet[2863]: E0124 00:53:33.617222 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:53:34.161000 audit[6708]: USER_ACCT pid=6708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.168023 sshd-session[6708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:34.170210 sshd[6708]: Accepted publickey for core from 4.153.228.146 port 56506 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:34.177235 kernel: audit: type=1101 audit(1769216014.161:1311): pid=6708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.192819 kernel: audit: type=1103 audit(1769216014.164:1312): pid=6708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.164000 audit[6708]: CRED_ACQ pid=6708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.203115 kernel: audit: type=1006 audit(1769216014.164:1313): pid=6708 uid=0 
subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=72 res=1 Jan 24 00:53:34.164000 audit[6708]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdf306a920 a2=3 a3=0 items=0 ppid=1 pid=6708 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:34.219364 systemd-logind[1656]: New session 72 of user core. Jan 24 00:53:34.164000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:34.222741 kernel: audit: type=1300 audit(1769216014.164:1313): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdf306a920 a2=3 a3=0 items=0 ppid=1 pid=6708 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:34.223832 kernel: audit: type=1327 audit(1769216014.164:1313): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:34.227374 systemd[1]: Started session-72.scope - Session 72 of User core. Jan 24 00:53:34.234000 audit[6708]: USER_START pid=6708 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.243000 audit[6712]: CRED_ACQ pid=6712 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.253319 kernel: audit: type=1105 audit(1769216014.234:1314): pid=6708 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.253395 kernel: audit: type=1103 audit(1769216014.243:1315): pid=6712 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.656217 sshd[6712]: Connection closed by 4.153.228.146 port 56506 Jan 24 00:53:34.658575 sshd-session[6708]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:34.659000 audit[6708]: USER_END pid=6708 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.678105 kernel: audit: type=1106 audit(1769216014.659:1316): pid=6708 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.676000 
audit[6708]: CRED_DISP pid=6708 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.681827 systemd[1]: sshd@71-65.109.167.77:22-4.153.228.146:56506.service: Deactivated successfully. Jan 24 00:53:34.685778 systemd[1]: session-72.scope: Deactivated successfully. Jan 24 00:53:34.692543 systemd-logind[1656]: Session 72 logged out. Waiting for processes to exit. Jan 24 00:53:34.693087 kernel: audit: type=1104 audit(1769216014.676:1317): pid=6708 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:34.681000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-65.109.167.77:22-4.153.228.146:56506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:34.695018 systemd-logind[1656]: Removed session 72. Jan 24 00:53:37.618794 kubelet[2863]: E0124 00:53:37.618624 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:53:37.814097 update_engine[1659]: I20260124 00:53:37.813241 1659 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 24 00:53:37.814097 update_engine[1659]: I20260124 00:53:37.813324 1659 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 24 00:53:37.814097 update_engine[1659]: I20260124 00:53:37.813781 1659 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 24 00:53:37.815701 update_engine[1659]: I20260124 00:53:37.815312 1659 omaha_request_params.cc:62] Current group set to alpha Jan 24 00:53:37.820398 update_engine[1659]: I20260124 00:53:37.817911 1659 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 24 00:53:37.820398 update_engine[1659]: I20260124 00:53:37.819128 1659 update_attempter.cc:643] Scheduling an action processor start. 
Jan 24 00:53:37.820398 update_engine[1659]: I20260124 00:53:37.819204 1659 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 24 00:53:37.829498 update_engine[1659]: I20260124 00:53:37.828222 1659 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 24 00:53:37.829806 update_engine[1659]: I20260124 00:53:37.829757 1659 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 24 00:53:37.829899 update_engine[1659]: I20260124 00:53:37.829874 1659 omaha_request_action.cc:272] Request: Jan 24 00:53:37.829899 update_engine[1659]: Jan 24 00:53:37.829899 update_engine[1659]: Jan 24 00:53:37.829899 update_engine[1659]: Jan 24 00:53:37.829899 update_engine[1659]: Jan 24 00:53:37.829899 update_engine[1659]: Jan 24 00:53:37.829899 update_engine[1659]: Jan 24 00:53:37.829899 update_engine[1659]: Jan 24 00:53:37.829899 update_engine[1659]: Jan 24 00:53:37.831787 update_engine[1659]: I20260124 00:53:37.830111 1659 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 24 00:53:37.831831 locksmithd[1702]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 24 00:53:37.834378 update_engine[1659]: I20260124 00:53:37.834339 1659 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 24 00:53:37.835445 update_engine[1659]: I20260124 00:53:37.835381 1659 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 24 00:53:37.836182 update_engine[1659]: E20260124 00:53:37.836141 1659 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 24 00:53:37.836365 update_engine[1659]: I20260124 00:53:37.836340 1659 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 24 00:53:39.788000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-65.109.167.77:22-4.153.228.146:38574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:39.789829 systemd[1]: Started sshd@72-65.109.167.77:22-4.153.228.146:38574.service - OpenSSH per-connection server daemon (4.153.228.146:38574). Jan 24 00:53:39.793148 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:39.793248 kernel: audit: type=1130 audit(1769216019.788:1319): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-65.109.167.77:22-4.153.228.146:38574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:53:40.477000 audit[6725]: USER_ACCT pid=6725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.481424 sshd[6725]: Accepted publickey for core from 4.153.228.146 port 38574 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:40.488115 kernel: audit: type=1101 audit(1769216020.477:1320): pid=6725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.488242 kernel: audit: type=1103 audit(1769216020.479:1321): pid=6725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.479000 audit[6725]: CRED_ACQ pid=6725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.487730 sshd-session[6725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:40.494485 kernel: audit: type=1006 audit(1769216020.479:1322): pid=6725 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=73 res=1 Jan 24 00:53:40.479000 audit[6725]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc83aefd10 a2=3 a3=0 items=0 ppid=1 pid=6725 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:40.499558 systemd-logind[1656]: New session 73 of user core. Jan 24 00:53:40.502211 kernel: audit: type=1300 audit(1769216020.479:1322): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc83aefd10 a2=3 a3=0 items=0 ppid=1 pid=6725 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:40.479000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:40.506799 kernel: audit: type=1327 audit(1769216020.479:1322): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:40.510288 systemd[1]: Started session-73.scope - Session 73 of User core. 
Jan 24 00:53:40.512000 audit[6725]: USER_START pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.523097 kernel: audit: type=1105 audit(1769216020.512:1323): pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.516000 audit[6729]: CRED_ACQ pid=6729 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.531097 kernel: audit: type=1103 audit(1769216020.516:1324): pid=6729 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.618181 kubelet[2863]: E0124 00:53:40.617697 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:53:40.619189 kubelet[2863]: E0124 00:53:40.618443 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:53:40.951099 sshd[6729]: Connection closed by 4.153.228.146 port 38574 Jan 24 00:53:40.951311 sshd-session[6725]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:40.954000 audit[6725]: USER_END pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.971097 kernel: audit: type=1106 audit(1769216020.954:1325): pid=6725 uid=0 auid=500 
ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.954000 audit[6725]: CRED_DISP pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.974422 systemd[1]: sshd@72-65.109.167.77:22-4.153.228.146:38574.service: Deactivated successfully. Jan 24 00:53:40.981846 systemd[1]: session-73.scope: Deactivated successfully. Jan 24 00:53:40.983943 kernel: audit: type=1104 audit(1769216020.954:1326): pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:40.983393 systemd-logind[1656]: Session 73 logged out. Waiting for processes to exit. Jan 24 00:53:40.973000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-65.109.167.77:22-4.153.228.146:38574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:40.992394 systemd-logind[1656]: Removed session 73. Jan 24 00:53:43.618083 kubelet[2863]: E0124 00:53:43.617896 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:53:43.618083 kubelet[2863]: E0124 00:53:43.618042 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:53:44.616438 kubelet[2863]: E0124 00:53:44.616387 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:53:46.081000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-65.109.167.77:22-4.153.228.146:60396 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:46.082278 systemd[1]: Started sshd@73-65.109.167.77:22-4.153.228.146:60396.service - OpenSSH per-connection server daemon (4.153.228.146:60396). Jan 24 00:53:46.083399 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:46.083532 kernel: audit: type=1130 audit(1769216026.081:1328): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-65.109.167.77:22-4.153.228.146:60396 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:46.737000 audit[6741]: USER_ACCT pid=6741 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:46.740960 sshd[6741]: Accepted publickey for core from 4.153.228.146 port 60396 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:46.744208 sshd-session[6741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:46.739000 audit[6741]: CRED_ACQ pid=6741 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:46.746961 kernel: audit: type=1101 audit(1769216026.737:1329): pid=6741 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:46.747131 kernel: audit: type=1103 audit(1769216026.739:1330): pid=6741 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:46.753346 kernel: audit: type=1006 audit(1769216026.739:1331): pid=6741 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=74 res=1 Jan 24 00:53:46.739000 audit[6741]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffda394d910 a2=3 a3=0 items=0 ppid=1 pid=6741 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:46.757854 kernel: audit: type=1300 audit(1769216026.739:1331): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffda394d910 a2=3 a3=0 items=0 ppid=1 pid=6741 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:46.760469 systemd-logind[1656]: New session 74 of user core. 
Jan 24 00:53:46.763020 kernel: audit: type=1327 audit(1769216026.739:1331): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:46.739000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:46.766243 systemd[1]: Started session-74.scope - Session 74 of User core. Jan 24 00:53:46.769000 audit[6741]: USER_START pid=6741 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:46.778000 audit[6745]: CRED_ACQ pid=6745 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:46.790767 kernel: audit: type=1105 audit(1769216026.769:1332): pid=6741 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:46.790838 kernel: audit: type=1103 audit(1769216026.778:1333): pid=6745 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:47.230101 sshd[6745]: Connection closed by 4.153.228.146 port 60396 Jan 24 00:53:47.232285 sshd-session[6741]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:47.251473 kernel: audit: type=1106 audit(1769216027.234:1334): pid=6741 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:47.234000 audit[6741]: USER_END pid=6741 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:47.240839 systemd[1]: sshd@73-65.109.167.77:22-4.153.228.146:60396.service: Deactivated successfully. Jan 24 00:53:47.246223 systemd[1]: session-74.scope: Deactivated successfully. Jan 24 00:53:47.263382 kernel: audit: type=1104 audit(1769216027.234:1335): pid=6741 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:47.234000 audit[6741]: CRED_DISP pid=6741 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:47.262938 systemd-logind[1656]: Session 74 logged out. Waiting for processes to exit. 
Jan 24 00:53:47.239000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-65.109.167.77:22-4.153.228.146:60396 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:47.267033 systemd-logind[1656]: Removed session 74. Jan 24 00:53:47.809636 update_engine[1659]: I20260124 00:53:47.809172 1659 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 24 00:53:47.809636 update_engine[1659]: I20260124 00:53:47.809262 1659 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 24 00:53:47.809636 update_engine[1659]: I20260124 00:53:47.809573 1659 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 24 00:53:47.810316 update_engine[1659]: E20260124 00:53:47.810041 1659 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 24 00:53:47.810830 update_engine[1659]: I20260124 00:53:47.810796 1659 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 24 00:53:52.378089 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:52.378190 kernel: audit: type=1130 audit(1769216032.368:1337): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-65.109.167.77:22-4.153.228.146:60398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:52.368000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-65.109.167.77:22-4.153.228.146:60398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:52.369330 systemd[1]: Started sshd@74-65.109.167.77:22-4.153.228.146:60398.service - OpenSSH per-connection server daemon (4.153.228.146:60398). 
Jan 24 00:53:52.618204 kubelet[2863]: E0124 00:53:52.618163 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:53:52.619662 kubelet[2863]: E0124 00:53:52.619414 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:53:53.030000 audit[6757]: USER_ACCT pid=6757 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.036009 sshd-session[6757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:53.037846 sshd[6757]: Accepted publickey for core from 4.153.228.146 port 60398 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:53.031000 audit[6757]: CRED_ACQ pid=6757 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.049311 kernel: audit: type=1101 audit(1769216033.030:1338): pid=6757 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.049394 kernel: audit: type=1103 audit(1769216033.031:1339): pid=6757 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.050835 systemd-logind[1656]: New session 75 of user core. 
Jan 24 00:53:53.062152 kernel: audit: type=1006 audit(1769216033.031:1340): pid=6757 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=75 res=1 Jan 24 00:53:53.031000 audit[6757]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcb24590b0 a2=3 a3=0 items=0 ppid=1 pid=6757 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:53.072178 kernel: audit: type=1300 audit(1769216033.031:1340): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcb24590b0 a2=3 a3=0 items=0 ppid=1 pid=6757 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:53.073684 systemd[1]: Started session-75.scope - Session 75 of User core. Jan 24 00:53:53.086479 kernel: audit: type=1327 audit(1769216033.031:1340): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:53.031000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:53.088000 audit[6757]: USER_START pid=6757 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.106102 kernel: audit: type=1105 audit(1769216033.088:1341): pid=6757 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.106179 kernel: audit: type=1103 audit(1769216033.092:1342): pid=6761 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.092000 audit[6761]: CRED_ACQ pid=6761 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.512369 sshd[6761]: Connection closed by 4.153.228.146 port 60398 Jan 24 00:53:53.514728 sshd-session[6757]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:53.519000 audit[6757]: USER_END pid=6757 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.537177 kernel: audit: type=1106 audit(1769216033.519:1343): pid=6757 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 
00:53:53.527640 systemd[1]: sshd@74-65.109.167.77:22-4.153.228.146:60398.service: Deactivated successfully. Jan 24 00:53:53.534149 systemd[1]: session-75.scope: Deactivated successfully. Jan 24 00:53:53.537330 systemd-logind[1656]: Session 75 logged out. Waiting for processes to exit. Jan 24 00:53:53.519000 audit[6757]: CRED_DISP pid=6757 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.544762 systemd-logind[1656]: Removed session 75. Jan 24 00:53:53.552103 kernel: audit: type=1104 audit(1769216033.519:1344): pid=6757 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:53.527000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-65.109.167.77:22-4.153.228.146:60398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:53.621580 kubelet[2863]: E0124 00:53:53.621547 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:53:55.617425 kubelet[2863]: E0124 00:53:55.617257 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:53:56.616398 kubelet[2863]: E0124 00:53:56.616323 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:53:57.618094 kubelet[2863]: E0124 00:53:57.617429 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:53:57.812458 update_engine[1659]: I20260124 00:53:57.812347 1659 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 24 00:53:57.813171 update_engine[1659]: I20260124 00:53:57.812474 1659 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 24 00:53:57.813408 update_engine[1659]: I20260124 00:53:57.813280 1659 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 24 00:53:57.813671 update_engine[1659]: E20260124 00:53:57.813593 1659 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 24 00:53:57.813772 update_engine[1659]: I20260124 00:53:57.813727 1659 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 24 00:53:58.654468 systemd[1]: Started sshd@75-65.109.167.77:22-4.153.228.146:48214.service - OpenSSH per-connection server daemon (4.153.228.146:48214). Jan 24 00:53:58.653000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-65.109.167.77:22-4.153.228.146:48214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:58.656728 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:53:58.656775 kernel: audit: type=1130 audit(1769216038.653:1346): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-65.109.167.77:22-4.153.228.146:48214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:53:59.337000 audit[6773]: USER_ACCT pid=6773 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.339158 sshd[6773]: Accepted publickey for core from 4.153.228.146 port 48214 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:53:59.346605 sshd-session[6773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:53:59.353592 kernel: audit: type=1101 audit(1769216039.337:1347): pid=6773 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.341000 audit[6773]: CRED_ACQ pid=6773 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.370141 kernel: audit: type=1103 audit(1769216039.341:1348): pid=6773 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.381883 systemd-logind[1656]: New session 76 of user core. 
Jan 24 00:53:59.390124 kernel: audit: type=1006 audit(1769216039.341:1349): pid=6773 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=76 res=1 Jan 24 00:53:59.390323 kernel: audit: type=1300 audit(1769216039.341:1349): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcaada9540 a2=3 a3=0 items=0 ppid=1 pid=6773 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:59.341000 audit[6773]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcaada9540 a2=3 a3=0 items=0 ppid=1 pid=6773 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:53:59.394241 systemd[1]: Started session-76.scope - Session 76 of User core. Jan 24 00:53:59.341000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:59.408000 audit[6773]: USER_START pid=6773 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.415647 kernel: audit: type=1327 audit(1769216039.341:1349): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:53:59.417243 kernel: audit: type=1105 audit(1769216039.408:1350): pid=6773 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.419000 audit[6779]: CRED_ACQ pid=6779 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.437132 kernel: audit: type=1103 audit(1769216039.419:1351): pid=6779 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.855921 sshd[6779]: Connection closed by 4.153.228.146 port 48214 Jan 24 00:53:59.856870 sshd-session[6773]: pam_unix(sshd:session): session closed for user core Jan 24 00:53:59.859000 audit[6773]: USER_END pid=6773 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.865287 systemd-logind[1656]: Session 76 logged out. Waiting for processes to exit. Jan 24 00:53:59.866452 systemd[1]: sshd@75-65.109.167.77:22-4.153.228.146:48214.service: Deactivated successfully. Jan 24 00:53:59.874509 systemd[1]: session-76.scope: Deactivated successfully. 
Jan 24 00:53:59.877137 kernel: audit: type=1106 audit(1769216039.859:1352): pid=6773 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.878009 kernel: audit: type=1104 audit(1769216039.859:1353): pid=6773 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.859000 audit[6773]: CRED_DISP pid=6773 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:53:59.884435 systemd-logind[1656]: Removed session 76. Jan 24 00:53:59.866000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-65.109.167.77:22-4.153.228.146:48214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:03.620558 kubelet[2863]: E0124 00:54:03.620509 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:54:03.622146 kubelet[2863]: E0124 00:54:03.621842 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:54:04.994493 systemd[1]: Started sshd@76-65.109.167.77:22-4.153.228.146:36752.service - OpenSSH per-connection server daemon (4.153.228.146:36752). Jan 24 00:54:04.999647 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:54:04.999754 kernel: audit: type=1130 audit(1769216044.993:1355): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-65.109.167.77:22-4.153.228.146:36752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:54:04.993000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-65.109.167.77:22-4.153.228.146:36752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:05.699000 audit[6814]: USER_ACCT pid=6814 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:05.706233 sshd-session[6814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:05.709305 sshd[6814]: Accepted publickey for core from 4.153.228.146 port 36752 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:05.699000 audit[6814]: CRED_ACQ pid=6814 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:05.718909 kernel: audit: type=1101 audit(1769216045.699:1356): pid=6814 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:05.719018 kernel: audit: type=1103 audit(1769216045.699:1357): pid=6814 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:05.723633 systemd-logind[1656]: New session 77 of user core. Jan 24 00:54:05.730380 kernel: audit: type=1006 audit(1769216045.699:1358): pid=6814 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=77 res=1 Jan 24 00:54:05.699000 audit[6814]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe1230cd0 a2=3 a3=0 items=0 ppid=1 pid=6814 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:05.741642 kernel: audit: type=1300 audit(1769216045.699:1358): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe1230cd0 a2=3 a3=0 items=0 ppid=1 pid=6814 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:05.699000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:05.752763 kernel: audit: type=1327 audit(1769216045.699:1358): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:05.753369 systemd[1]: Started session-77.scope - Session 77 of User core. 
Jan 24 00:54:05.760000 audit[6814]: USER_START pid=6814 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:05.763000 audit[6818]: CRED_ACQ pid=6818 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:05.776631 kernel: audit: type=1105 audit(1769216045.760:1359): pid=6814 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:05.776786 kernel: audit: type=1103 audit(1769216045.763:1360): pid=6818 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:06.210533 sshd[6818]: Connection closed by 4.153.228.146 port 36752 Jan 24 00:54:06.212348 sshd-session[6814]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:06.214000 audit[6814]: USER_END pid=6814 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:06.233242 kernel: audit: type=1106 audit(1769216046.214:1361): pid=6814 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:06.222401 systemd[1]: sshd@76-65.109.167.77:22-4.153.228.146:36752.service: Deactivated successfully. Jan 24 00:54:06.214000 audit[6814]: CRED_DISP pid=6814 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:06.227996 systemd[1]: session-77.scope: Deactivated successfully. Jan 24 00:54:06.231405 systemd-logind[1656]: Session 77 logged out. Waiting for processes to exit. Jan 24 00:54:06.234475 systemd-logind[1656]: Removed session 77. Jan 24 00:54:06.249094 kernel: audit: type=1104 audit(1769216046.214:1362): pid=6814 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:06.218000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-65.109.167.77:22-4.153.228.146:36752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:54:07.616819 kubelet[2863]: E0124 00:54:07.616753 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:54:07.810854 update_engine[1659]: I20260124 00:54:07.810741 1659 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 24 00:54:07.810854 update_engine[1659]: I20260124 00:54:07.810854 1659 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 24 00:54:07.811807 update_engine[1659]: I20260124 00:54:07.811719 1659 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 24 00:54:07.812093 update_engine[1659]: E20260124 00:54:07.812016 1659 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 24 00:54:07.812197 update_engine[1659]: I20260124 00:54:07.812170 1659 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 24 00:54:07.812197 update_engine[1659]: I20260124 00:54:07.812188 1659 omaha_request_action.cc:617] Omaha request response: Jan 24 00:54:07.812352 update_engine[1659]: E20260124 00:54:07.812309 1659 omaha_request_action.cc:636] Omaha request network transfer failed. Jan 24 00:54:07.812409 update_engine[1659]: I20260124 00:54:07.812347 1659 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jan 24 00:54:07.812409 update_engine[1659]: I20260124 00:54:07.812361 1659 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 24 00:54:07.812409 update_engine[1659]: I20260124 00:54:07.812374 1659 update_attempter.cc:306] Processing Done. Jan 24 00:54:07.812409 update_engine[1659]: E20260124 00:54:07.812397 1659 update_attempter.cc:619] Update failed. Jan 24 00:54:07.812796 update_engine[1659]: I20260124 00:54:07.812412 1659 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 24 00:54:07.812796 update_engine[1659]: I20260124 00:54:07.812425 1659 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 24 00:54:07.812796 update_engine[1659]: I20260124 00:54:07.812438 1659 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Jan 24 00:54:07.812796 update_engine[1659]: I20260124 00:54:07.812547 1659 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 24 00:54:07.812796 update_engine[1659]: I20260124 00:54:07.812577 1659 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 24 00:54:07.812796 update_engine[1659]: I20260124 00:54:07.812590 1659 omaha_request_action.cc:272] Request: Jan 24 00:54:07.812796 update_engine[1659]: Jan 24 00:54:07.812796 update_engine[1659]: Jan 24 00:54:07.812796 update_engine[1659]: Jan 24 00:54:07.812796 update_engine[1659]: Jan 24 00:54:07.812796 update_engine[1659]: Jan 24 00:54:07.812796 update_engine[1659]: Jan 24 00:54:07.812796 update_engine[1659]: I20260124 00:54:07.812604 1659 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 24 00:54:07.812796 update_engine[1659]: I20260124 00:54:07.812642 1659 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 24 00:54:07.813415 update_engine[1659]: I20260124 00:54:07.813154 1659 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 24 00:54:07.813771 update_engine[1659]: E20260124 00:54:07.813452 1659 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 24 00:54:07.813771 update_engine[1659]: I20260124 00:54:07.813536 1659 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 24 00:54:07.813771 update_engine[1659]: I20260124 00:54:07.813550 1659 omaha_request_action.cc:617] Omaha request response: Jan 24 00:54:07.813771 update_engine[1659]: I20260124 00:54:07.813565 1659 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 24 00:54:07.813771 update_engine[1659]: I20260124 00:54:07.813582 1659 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 24 00:54:07.813771 update_engine[1659]: I20260124 00:54:07.813596 1659 update_attempter.cc:306] Processing Done. Jan 24 00:54:07.813771 update_engine[1659]: I20260124 00:54:07.813611 1659 update_attempter.cc:310] Error event sent. 
Jan 24 00:54:07.813771 update_engine[1659]: I20260124 00:54:07.813632 1659 update_check_scheduler.cc:74] Next update check in 43m48s Jan 24 00:54:07.814231 locksmithd[1702]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 24 00:54:07.814231 locksmithd[1702]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 24 00:54:09.621814 kubelet[2863]: E0124 00:54:09.621734 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:54:09.623585 kubelet[2863]: E0124 00:54:09.623191 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:54:11.345285 systemd[1]: Started sshd@77-65.109.167.77:22-4.153.228.146:36768.service - OpenSSH per-connection server daemon (4.153.228.146:36768). Jan 24 00:54:11.344000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-65.109.167.77:22-4.153.228.146:36768 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:11.346738 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:54:11.346803 kernel: audit: type=1130 audit(1769216051.344:1364): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-65.109.167.77:22-4.153.228.146:36768 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:54:11.999000 audit[6844]: USER_ACCT pid=6844 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.004203 sshd-session[6844]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:12.006446 sshd[6844]: Accepted publickey for core from 4.153.228.146 port 36768 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:12.017757 kernel: audit: type=1101 audit(1769216051.999:1365): pid=6844 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.017864 kernel: audit: type=1103 audit(1769216052.001:1366): pid=6844 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.001000 audit[6844]: CRED_ACQ pid=6844 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.017113 systemd-logind[1656]: New session 78 of user core. Jan 24 00:54:12.034140 kernel: audit: type=1006 audit(1769216052.001:1367): pid=6844 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=78 res=1 Jan 24 00:54:12.001000 audit[6844]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe5eb125a0 a2=3 a3=0 items=0 ppid=1 pid=6844 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:12.057118 kernel: audit: type=1300 audit(1769216052.001:1367): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe5eb125a0 a2=3 a3=0 items=0 ppid=1 pid=6844 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:12.001000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:12.063601 systemd[1]: Started session-78.scope - Session 78 of User core. 
Jan 24 00:54:12.064216 kernel: audit: type=1327 audit(1769216052.001:1367): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:12.069000 audit[6844]: USER_START pid=6844 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.079000 audit[6848]: CRED_ACQ pid=6848 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.089434 kernel: audit: type=1105 audit(1769216052.069:1368): pid=6844 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.089538 kernel: audit: type=1103 audit(1769216052.079:1369): pid=6848 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.499432 sshd[6848]: Connection closed by 4.153.228.146 port 36768 Jan 24 00:54:12.500445 sshd-session[6844]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:12.503000 audit[6844]: USER_END pid=6844 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.511487 systemd[1]: sshd@77-65.109.167.77:22-4.153.228.146:36768.service: Deactivated successfully. Jan 24 00:54:12.519216 systemd[1]: session-78.scope: Deactivated successfully. Jan 24 00:54:12.523188 kernel: audit: type=1106 audit(1769216052.503:1370): pid=6844 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.523280 kernel: audit: type=1104 audit(1769216052.503:1371): pid=6844 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.503000 audit[6844]: CRED_DISP pid=6844 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:12.525676 systemd-logind[1656]: Session 78 logged out. Waiting for processes to exit. Jan 24 00:54:12.532776 systemd-logind[1656]: Removed session 78. 
Jan 24 00:54:12.507000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-65.109.167.77:22-4.153.228.146:36768 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:12.616942 kubelet[2863]: E0124 00:54:12.616861 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:54:16.618300 kubelet[2863]: E0124 00:54:16.618039 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:54:17.658191 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:54:17.658423 kernel: audit: type=1130 audit(1769216057.645:1373): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-65.109.167.77:22-4.153.228.146:59172 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:17.645000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-65.109.167.77:22-4.153.228.146:59172 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:17.646665 systemd[1]: Started sshd@78-65.109.167.77:22-4.153.228.146:59172.service - OpenSSH per-connection server daemon (4.153.228.146:59172). 
Jan 24 00:54:18.348000 audit[6867]: USER_ACCT pid=6867 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.366138 kernel: audit: type=1101 audit(1769216058.348:1374): pid=6867 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.367696 sshd[6867]: Accepted publickey for core from 4.153.228.146 port 59172 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:18.370330 sshd-session[6867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:18.366000 audit[6867]: CRED_ACQ pid=6867 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.385096 kernel: audit: type=1103 audit(1769216058.366:1375): pid=6867 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.367000 audit[6867]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc0762c3e0 a2=3 a3=0 items=0 ppid=1 pid=6867 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:18.397413 kernel: audit: type=1006 audit(1769216058.367:1376): pid=6867 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=79 res=1 Jan 24 00:54:18.397494 kernel: audit: type=1300 audit(1769216058.367:1376): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc0762c3e0 a2=3 a3=0 items=0 ppid=1 pid=6867 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:18.404962 systemd-logind[1656]: New session 79 of user core. Jan 24 00:54:18.367000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:18.410082 kernel: audit: type=1327 audit(1769216058.367:1376): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:18.411446 systemd[1]: Started session-79.scope - Session 79 of User core. 
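In the type=1300 SYSCALL record above, arch=c000003e is AUDIT_ARCH_X86_64 and syscall=1 is write(2) on that architecture. A trivial lookup covering just the values seen here (again illustrative Python, not journal output):

# Map the audit SYSCALL field values seen above to names.
# Partial lookup for illustration only.
AUDIT_ARCH = {0xC000003E: "AUDIT_ARCH_X86_64"}
SYSCALL_X86_64 = {1: "write"}
print(AUDIT_ARCH[0xC000003E], SYSCALL_X86_64[1])
# -> AUDIT_ARCH_X86_64 write
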
Jan 24 00:54:18.416000 audit[6867]: USER_START pid=6867 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.420000 audit[6871]: CRED_ACQ pid=6871 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.428519 kernel: audit: type=1105 audit(1769216058.416:1377): pid=6867 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.428605 kernel: audit: type=1103 audit(1769216058.420:1378): pid=6871 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.616796 kubelet[2863]: E0124 00:54:18.616457 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:54:18.828872 sshd[6871]: Connection closed by 4.153.228.146 port 59172 Jan 24 00:54:18.829840 sshd-session[6867]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:18.829000 audit[6867]: USER_END pid=6867 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.833358 systemd-logind[1656]: Session 79 logged out. Waiting for processes to exit. Jan 24 00:54:18.835604 systemd[1]: sshd@78-65.109.167.77:22-4.153.228.146:59172.service: Deactivated successfully. Jan 24 00:54:18.838089 systemd[1]: session-79.scope: Deactivated successfully. 
Jan 24 00:54:18.829000 audit[6867]: CRED_DISP pid=6867 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.840921 kernel: audit: type=1106 audit(1769216058.829:1379): pid=6867 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.840967 kernel: audit: type=1104 audit(1769216058.829:1380): pid=6867 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:18.840557 systemd-logind[1656]: Removed session 79. Jan 24 00:54:18.832000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-65.109.167.77:22-4.153.228.146:59172 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:20.616685 kubelet[2863]: E0124 00:54:20.616526 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:54:21.618720 kubelet[2863]: E0124 00:54:21.618276 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:54:23.619117 kubelet[2863]: E0124 00:54:23.618880 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:54:23.620829 kubelet[2863]: E0124 00:54:23.620543 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", 
failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:54:23.969115 systemd[1]: Started sshd@79-65.109.167.77:22-4.153.228.146:59186.service - OpenSSH per-connection server daemon (4.153.228.146:59186). Jan 24 00:54:23.971253 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:54:23.971313 kernel: audit: type=1130 audit(1769216063.968:1382): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-65.109.167.77:22-4.153.228.146:59186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:23.968000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-65.109.167.77:22-4.153.228.146:59186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:24.677695 sshd[6885]: Accepted publickey for core from 4.153.228.146 port 59186 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:24.676000 audit[6885]: USER_ACCT pid=6885 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:24.682897 sshd-session[6885]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:24.690157 kernel: audit: type=1101 audit(1769216064.676:1383): pid=6885 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:24.691428 kernel: audit: type=1103 audit(1769216064.679:1384): pid=6885 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:24.679000 audit[6885]: CRED_ACQ pid=6885 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:24.705103 kernel: audit: type=1006 audit(1769216064.679:1385): pid=6885 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=80 res=1 Jan 24 00:54:24.711115 kernel: audit: type=1300 audit(1769216064.679:1385): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc8c585de0 a2=3 a3=0 items=0 ppid=1 pid=6885 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:24.679000 audit[6885]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc8c585de0 a2=3 a3=0 items=0 ppid=1 pid=6885 auid=500 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:24.707191 systemd-logind[1656]: New session 80 of user core. Jan 24 00:54:24.720207 kernel: audit: type=1327 audit(1769216064.679:1385): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:24.679000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:24.723372 systemd[1]: Started session-80.scope - Session 80 of User core. Jan 24 00:54:24.734000 audit[6885]: USER_START pid=6885 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:24.752111 kernel: audit: type=1105 audit(1769216064.734:1386): pid=6885 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:24.758200 kernel: audit: type=1103 audit(1769216064.751:1387): pid=6889 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:24.751000 audit[6889]: CRED_ACQ pid=6889 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:25.122024 sshd[6889]: Connection closed by 4.153.228.146 port 59186 Jan 24 00:54:25.123049 sshd-session[6885]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:25.139526 kernel: audit: type=1106 audit(1769216065.127:1388): pid=6885 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:25.127000 audit[6885]: USER_END pid=6885 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:25.136207 systemd[1]: sshd@79-65.109.167.77:22-4.153.228.146:59186.service: Deactivated successfully. Jan 24 00:54:25.127000 audit[6885]: CRED_DISP pid=6885 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:25.138114 systemd[1]: session-80.scope: Deactivated successfully. Jan 24 00:54:25.143655 systemd-logind[1656]: Session 80 logged out. Waiting for processes to exit. Jan 24 00:54:25.144651 systemd-logind[1656]: Removed session 80. 
Jan 24 00:54:25.135000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-65.109.167.77:22-4.153.228.146:59186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:25.153422 kernel: audit: type=1104 audit(1769216065.127:1389): pid=6885 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:30.260973 systemd[1]: Started sshd@80-65.109.167.77:22-4.153.228.146:36132.service - OpenSSH per-connection server daemon (4.153.228.146:36132). Jan 24 00:54:30.262694 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:54:30.262753 kernel: audit: type=1130 audit(1769216070.260:1391): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-65.109.167.77:22-4.153.228.146:36132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:30.260000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-65.109.167.77:22-4.153.228.146:36132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:30.958000 audit[6904]: USER_ACCT pid=6904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:30.960201 sshd[6904]: Accepted publickey for core from 4.153.228.146 port 36132 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:30.963864 sshd-session[6904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:30.961000 audit[6904]: CRED_ACQ pid=6904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:30.967650 kernel: audit: type=1101 audit(1769216070.958:1392): pid=6904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:30.967722 kernel: audit: type=1103 audit(1769216070.961:1393): pid=6904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:30.973161 kernel: audit: type=1006 audit(1769216070.961:1394): pid=6904 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=81 res=1 Jan 24 00:54:30.961000 audit[6904]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe0f6459a0 a2=3 a3=0 items=0 ppid=1 pid=6904 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:30.977570 kernel: audit: type=1300 audit(1769216070.961:1394): arch=c000003e syscall=1 
success=yes exit=3 a0=8 a1=7ffe0f6459a0 a2=3 a3=0 items=0 ppid=1 pid=6904 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:30.978943 systemd-logind[1656]: New session 81 of user core. Jan 24 00:54:30.982676 kernel: audit: type=1327 audit(1769216070.961:1394): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:30.961000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:30.986278 systemd[1]: Started session-81.scope - Session 81 of User core. Jan 24 00:54:30.994000 audit[6904]: USER_START pid=6904 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:30.999000 audit[6908]: CRED_ACQ pid=6908 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:31.005315 kernel: audit: type=1105 audit(1769216070.994:1395): pid=6904 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:31.005401 kernel: audit: type=1103 audit(1769216070.999:1396): pid=6908 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:31.373257 sshd[6908]: Connection closed by 4.153.228.146 port 36132 Jan 24 00:54:31.374649 sshd-session[6904]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:31.377000 audit[6904]: USER_END pid=6904 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:31.389443 systemd[1]: sshd@80-65.109.167.77:22-4.153.228.146:36132.service: Deactivated successfully. Jan 24 00:54:31.398152 kernel: audit: type=1106 audit(1769216071.377:1397): pid=6904 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:31.396330 systemd[1]: session-81.scope: Deactivated successfully. Jan 24 00:54:31.377000 audit[6904]: CRED_DISP pid=6904 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:31.404186 systemd-logind[1656]: Session 81 logged out. 
Waiting for processes to exit. Jan 24 00:54:31.406398 systemd-logind[1656]: Removed session 81. Jan 24 00:54:31.387000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-65.109.167.77:22-4.153.228.146:36132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:31.414137 kernel: audit: type=1104 audit(1769216071.377:1398): pid=6904 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:31.510000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-65.109.167.77:22-4.153.228.146:36148 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:31.510806 systemd[1]: Started sshd@81-65.109.167.77:22-4.153.228.146:36148.service - OpenSSH per-connection server daemon (4.153.228.146:36148). Jan 24 00:54:31.620956 kubelet[2863]: E0124 00:54:31.620310 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:54:32.192970 sshd[6920]: Accepted publickey for core from 4.153.228.146 port 36148 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:32.191000 audit[6920]: USER_ACCT pid=6920 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:32.193000 audit[6920]: CRED_ACQ pid=6920 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:32.194000 audit[6920]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd8c2b05b0 a2=3 a3=0 items=0 ppid=1 pid=6920 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:32.194000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:32.197460 sshd-session[6920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:32.212043 systemd-logind[1656]: New session 82 of user core. Jan 24 00:54:32.218355 systemd[1]: Started session-82.scope - Session 82 of User core. 
Jan 24 00:54:32.227000 audit[6920]: USER_START pid=6920 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:32.231000 audit[6924]: CRED_ACQ pid=6924 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:32.616084 kubelet[2863]: E0124 00:54:32.615987 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:54:32.734784 sshd[6924]: Connection closed by 4.153.228.146 port 36148 Jan 24 00:54:32.736200 sshd-session[6920]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:32.736000 audit[6920]: USER_END pid=6920 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:32.737000 audit[6920]: CRED_DISP pid=6920 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:32.740185 systemd-logind[1656]: Session 82 logged out. Waiting for processes to exit. Jan 24 00:54:32.741991 systemd[1]: sshd@81-65.109.167.77:22-4.153.228.146:36148.service: Deactivated successfully. Jan 24 00:54:32.741000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-65.109.167.77:22-4.153.228.146:36148 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:32.744659 systemd[1]: session-82.scope: Deactivated successfully. Jan 24 00:54:32.746788 systemd-logind[1656]: Removed session 82. Jan 24 00:54:32.863551 systemd[1]: Started sshd@82-65.109.167.77:22-4.153.228.146:36152.service - OpenSSH per-connection server daemon (4.153.228.146:36152). Jan 24 00:54:32.862000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-65.109.167.77:22-4.153.228.146:36152 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:54:33.516000 audit[6968]: USER_ACCT pid=6968 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:33.518009 sshd[6968]: Accepted publickey for core from 4.153.228.146 port 36152 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:33.519000 audit[6968]: CRED_ACQ pid=6968 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:33.519000 audit[6968]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd0c80ccc0 a2=3 a3=0 items=0 ppid=1 pid=6968 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=83 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:33.519000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:33.523946 sshd-session[6968]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:33.538245 systemd-logind[1656]: New session 83 of user core. Jan 24 00:54:33.545326 systemd[1]: Started session-83.scope - Session 83 of User core. Jan 24 00:54:33.550000 audit[6968]: USER_START pid=6968 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:33.554000 audit[6972]: CRED_ACQ pid=6972 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:33.621794 kubelet[2863]: E0124 00:54:33.620471 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:54:33.621794 kubelet[2863]: E0124 00:54:33.620813 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:54:34.430000 audit[6982]: NETFILTER_CFG table=filter:137 family=2 entries=26 op=nft_register_rule pid=6982 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:54:34.430000 audit[6982]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffc9f28bb90 a2=0 a3=7ffc9f28bb7c items=0 ppid=2972 pid=6982 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:34.430000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:54:34.435000 audit[6982]: NETFILTER_CFG table=nat:138 family=2 entries=20 op=nft_register_rule pid=6982 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:54:34.435000 audit[6982]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc9f28bb90 a2=0 a3=0 items=0 ppid=2972 pid=6982 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:34.435000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:54:34.456000 audit[6984]: NETFILTER_CFG table=filter:139 family=2 entries=38 op=nft_register_rule pid=6984 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:54:34.456000 audit[6984]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffca6efafc0 a2=0 a3=7ffca6efafac items=0 ppid=2972 pid=6984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:34.456000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:54:34.465000 audit[6984]: NETFILTER_CFG table=nat:140 family=2 entries=20 op=nft_register_rule pid=6984 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 24 00:54:34.465000 audit[6984]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffca6efafc0 a2=0 a3=0 items=0 ppid=2972 pid=6984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:34.465000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 24 00:54:34.550611 sshd[6972]: Connection closed by 4.153.228.146 port 36152 Jan 24 00:54:34.551257 sshd-session[6968]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:34.551000 audit[6968]: USER_END pid=6968 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:34.551000 audit[6968]: CRED_DISP pid=6968 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:34.557510 systemd[1]: sshd@82-65.109.167.77:22-4.153.228.146:36152.service: Deactivated successfully. 
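The NETFILTER_CFG/SYSCALL entries above come from iptables-restore (syscall=46 is sendmsg on x86_64), and their proctitle field hex-encodes the full argv with NUL separators. Decoded (illustrative Python, not part of the captured journal) it reads iptables-restore -w 5 -W 100000 --noflush --counters:

# Decode the NUL-separated argv carried in the iptables-restore proctitle above.
# Illustrative Python only; not part of the journal output.
argv_hex = ("69707461626C65732D726573746F7265002D770035002D5700"
            "313030303030002D2D6E6F666C757368002D2D636F756E74657273")
argv = [a.decode("ascii") for a in bytes.fromhex(argv_hex).split(b"\x00")]
print(" ".join(argv))
# -> iptables-restore -w 5 -W 100000 --noflush --counters
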
Jan 24 00:54:34.557000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-65.109.167.77:22-4.153.228.146:36152 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:34.562368 systemd[1]: session-83.scope: Deactivated successfully. Jan 24 00:54:34.564922 systemd-logind[1656]: Session 83 logged out. Waiting for processes to exit. Jan 24 00:54:34.566208 systemd-logind[1656]: Removed session 83. Jan 24 00:54:34.688641 systemd[1]: Started sshd@83-65.109.167.77:22-4.153.228.146:36634.service - OpenSSH per-connection server daemon (4.153.228.146:36634). Jan 24 00:54:34.687000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-65.109.167.77:22-4.153.228.146:36634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:35.376995 kernel: kauditd_printk_skb: 36 callbacks suppressed Jan 24 00:54:35.377114 kernel: audit: type=1101 audit(1769216075.372:1423): pid=6989 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.372000 audit[6989]: USER_ACCT pid=6989 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.375656 sshd-session[6989]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:35.377431 sshd[6989]: Accepted publickey for core from 4.153.228.146 port 36634 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:35.372000 audit[6989]: CRED_ACQ pid=6989 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.407229 kernel: audit: type=1103 audit(1769216075.372:1424): pid=6989 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.407305 kernel: audit: type=1006 audit(1769216075.372:1425): pid=6989 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=84 res=1 Jan 24 00:54:35.407336 kernel: audit: type=1300 audit(1769216075.372:1425): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff0a5e4aa0 a2=3 a3=0 items=0 ppid=1 pid=6989 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:35.372000 audit[6989]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff0a5e4aa0 a2=3 a3=0 items=0 ppid=1 pid=6989 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:35.372000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 
00:54:35.418766 systemd-logind[1656]: New session 84 of user core. Jan 24 00:54:35.420089 kernel: audit: type=1327 audit(1769216075.372:1425): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:35.423213 systemd[1]: Started session-84.scope - Session 84 of User core. Jan 24 00:54:35.426000 audit[6989]: USER_START pid=6989 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.441235 kernel: audit: type=1105 audit(1769216075.426:1426): pid=6989 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.441333 kernel: audit: type=1103 audit(1769216075.426:1427): pid=6993 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.426000 audit[6993]: CRED_ACQ pid=6993 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.906222 sshd[6993]: Connection closed by 4.153.228.146 port 36634 Jan 24 00:54:35.908222 sshd-session[6989]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:35.908000 audit[6989]: USER_END pid=6989 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.916007 systemd[1]: sshd@83-65.109.167.77:22-4.153.228.146:36634.service: Deactivated successfully. Jan 24 00:54:35.916465 systemd-logind[1656]: Session 84 logged out. Waiting for processes to exit. Jan 24 00:54:35.919434 kernel: audit: type=1106 audit(1769216075.908:1428): pid=6989 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.920639 systemd[1]: session-84.scope: Deactivated successfully. Jan 24 00:54:35.908000 audit[6989]: CRED_DISP pid=6989 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.926165 systemd-logind[1656]: Removed session 84. 
Jan 24 00:54:35.928142 kernel: audit: type=1104 audit(1769216075.908:1429): pid=6989 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:35.916000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-65.109.167.77:22-4.153.228.146:36634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:35.935162 kernel: audit: type=1131 audit(1769216075.916:1430): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-65.109.167.77:22-4.153.228.146:36634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:36.039590 systemd[1]: Started sshd@84-65.109.167.77:22-4.153.228.146:36636.service - OpenSSH per-connection server daemon (4.153.228.146:36636). Jan 24 00:54:36.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-65.109.167.77:22-4.153.228.146:36636 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:36.725000 audit[7003]: USER_ACCT pid=7003 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:36.727444 sshd[7003]: Accepted publickey for core from 4.153.228.146 port 36636 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:36.728000 audit[7003]: CRED_ACQ pid=7003 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:36.729000 audit[7003]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd43c2ba10 a2=3 a3=0 items=0 ppid=1 pid=7003 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:36.729000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:36.732846 sshd-session[7003]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:36.742433 systemd-logind[1656]: New session 85 of user core. Jan 24 00:54:36.757366 systemd[1]: Started session-85.scope - Session 85 of User core. 
Jan 24 00:54:36.761000 audit[7003]: USER_START pid=7003 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:36.766000 audit[7007]: CRED_ACQ pid=7007 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:37.165906 sshd[7007]: Connection closed by 4.153.228.146 port 36636 Jan 24 00:54:37.166738 sshd-session[7003]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:37.167000 audit[7003]: USER_END pid=7003 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:37.167000 audit[7003]: CRED_DISP pid=7003 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:37.171000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-65.109.167.77:22-4.153.228.146:36636 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:37.171967 systemd[1]: sshd@84-65.109.167.77:22-4.153.228.146:36636.service: Deactivated successfully. Jan 24 00:54:37.175053 systemd[1]: session-85.scope: Deactivated successfully. Jan 24 00:54:37.178041 systemd-logind[1656]: Session 85 logged out. Waiting for processes to exit. Jan 24 00:54:37.180787 systemd-logind[1656]: Removed session 85. 
Jan 24 00:54:37.620901 kubelet[2863]: E0124 00:54:37.620833 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:54:38.616735 kubelet[2863]: E0124 00:54:38.616678 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:54:42.319515 kernel: kauditd_printk_skb: 11 callbacks suppressed Jan 24 00:54:42.319649 kernel: audit: type=1130 audit(1769216082.303:1440): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-65.109.167.77:22-4.153.228.146:36638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:42.303000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-65.109.167.77:22-4.153.228.146:36638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:42.304584 systemd[1]: Started sshd@85-65.109.167.77:22-4.153.228.146:36638.service - OpenSSH per-connection server daemon (4.153.228.146:36638). 
Jan 24 00:54:42.996000 audit[7019]: USER_ACCT pid=7019 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.008517 sshd-session[7019]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:43.010996 sshd[7019]: Accepted publickey for core from 4.153.228.146 port 36638 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:43.015245 kernel: audit: type=1101 audit(1769216082.996:1441): pid=7019 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.033796 kernel: audit: type=1103 audit(1769216083.003:1442): pid=7019 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.003000 audit[7019]: CRED_ACQ pid=7019 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.003000 audit[7019]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff6860c8b0 a2=3 a3=0 items=0 ppid=1 pid=7019 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:43.051538 kernel: audit: type=1006 audit(1769216083.003:1443): pid=7019 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=86 res=1 Jan 24 00:54:43.052492 kernel: audit: type=1300 audit(1769216083.003:1443): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff6860c8b0 a2=3 a3=0 items=0 ppid=1 pid=7019 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:43.057277 systemd-logind[1656]: New session 86 of user core. Jan 24 00:54:43.003000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:43.068117 kernel: audit: type=1327 audit(1769216083.003:1443): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:43.070329 systemd[1]: Started session-86.scope - Session 86 of User core. 
Jan 24 00:54:43.077000 audit[7019]: USER_START pid=7019 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.080000 audit[7023]: CRED_ACQ pid=7023 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.094233 kernel: audit: type=1105 audit(1769216083.077:1444): pid=7019 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.094320 kernel: audit: type=1103 audit(1769216083.080:1445): pid=7023 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.453683 sshd[7023]: Connection closed by 4.153.228.146 port 36638 Jan 24 00:54:43.456399 sshd-session[7019]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:43.456000 audit[7019]: USER_END pid=7019 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.462134 systemd-logind[1656]: Session 86 logged out. Waiting for processes to exit. Jan 24 00:54:43.463495 systemd[1]: sshd@85-65.109.167.77:22-4.153.228.146:36638.service: Deactivated successfully. Jan 24 00:54:43.467074 kernel: audit: type=1106 audit(1769216083.456:1446): pid=7019 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.466705 systemd[1]: session-86.scope: Deactivated successfully. Jan 24 00:54:43.457000 audit[7019]: CRED_DISP pid=7019 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.470831 systemd-logind[1656]: Removed session 86. Jan 24 00:54:43.476075 kernel: audit: type=1104 audit(1769216083.457:1447): pid=7019 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:43.459000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-65.109.167.77:22-4.153.228.146:36638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:54:44.616841 kubelet[2863]: E0124 00:54:44.616746 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:54:44.617705 kubelet[2863]: E0124 00:54:44.617344 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:54:45.615864 kubelet[2863]: E0124 00:54:45.615819 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:54:45.617348 kubelet[2863]: E0124 00:54:45.617299 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:54:48.587933 systemd[1]: Started sshd@86-65.109.167.77:22-4.153.228.146:53274.service - OpenSSH per-connection server daemon (4.153.228.146:53274). Jan 24 00:54:48.598417 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:54:48.598488 kernel: audit: type=1130 audit(1769216088.587:1449): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-65.109.167.77:22-4.153.228.146:53274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:48.587000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-65.109.167.77:22-4.153.228.146:53274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:54:49.263000 audit[7036]: USER_ACCT pid=7036 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.277102 sshd[7036]: Accepted publickey for core from 4.153.228.146 port 53274 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:49.279116 kernel: audit: type=1101 audit(1769216089.263:1450): pid=7036 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.279248 kernel: audit: type=1103 audit(1769216089.274:1451): pid=7036 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.274000 audit[7036]: CRED_ACQ pid=7036 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.281506 sshd-session[7036]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:49.303107 kernel: audit: type=1006 audit(1769216089.274:1452): pid=7036 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=87 res=1 Jan 24 00:54:49.303396 kernel: audit: type=1300 audit(1769216089.274:1452): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe98ca6b0 a2=3 a3=0 items=0 ppid=1 pid=7036 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:49.274000 audit[7036]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe98ca6b0 a2=3 a3=0 items=0 ppid=1 pid=7036 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:49.274000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:49.322579 kernel: audit: type=1327 audit(1769216089.274:1452): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:49.319641 systemd-logind[1656]: New session 87 of user core. Jan 24 00:54:49.331384 systemd[1]: Started session-87.scope - Session 87 of User core. 
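Each audit record also carries an audit(&lt;epoch seconds&gt;.&lt;milliseconds&gt;:&lt;serial&gt;) stamp. As a quick cross-check, a sketch that converts the stamp from the record above (audit(1769216089.274:1452)) and reproduces the journal's own wall-clock prefix:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Stamp taken from the record above: audit(1769216089.274:1452).
	const sec, msec = 1769216089, 274
	t := time.Unix(sec, msec*int64(time.Millisecond)).UTC()
	fmt.Println(t.Format("Jan 02 15:04:05.000 2006 MST"))
	// prints: Jan 24 00:54:49.274 2026 UTC
}

The serial after the colon (here 1452) is what ties the delayed kernel "audit: type=..." echoes back to the original records despite the interleaved timestamps.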
Jan 24 00:54:49.343000 audit[7036]: USER_START pid=7036 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.347000 audit[7041]: CRED_ACQ pid=7041 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.360596 kernel: audit: type=1105 audit(1769216089.343:1453): pid=7036 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.360730 kernel: audit: type=1103 audit(1769216089.347:1454): pid=7041 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.760100 sshd[7041]: Connection closed by 4.153.228.146 port 53274 Jan 24 00:54:49.760399 sshd-session[7036]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:49.771878 kernel: audit: type=1106 audit(1769216089.761:1455): pid=7036 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.771994 kernel: audit: type=1104 audit(1769216089.761:1456): pid=7036 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.761000 audit[7036]: USER_END pid=7036 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.761000 audit[7036]: CRED_DISP pid=7036 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:49.766299 systemd[1]: sshd@86-65.109.167.77:22-4.153.228.146:53274.service: Deactivated successfully. Jan 24 00:54:49.768556 systemd[1]: session-87.scope: Deactivated successfully. Jan 24 00:54:49.769492 systemd-logind[1656]: Session 87 logged out. Waiting for processes to exit. Jan 24 00:54:49.771498 systemd-logind[1656]: Removed session 87. Jan 24 00:54:49.765000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-65.109.167.77:22-4.153.228.146:53274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:54:50.617578 kubelet[2863]: E0124 00:54:50.617504 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:54:52.618217 kubelet[2863]: E0124 00:54:52.618035 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:54:54.894406 systemd[1]: Started sshd@87-65.109.167.77:22-4.153.228.146:33332.service - OpenSSH per-connection server daemon (4.153.228.146:33332). Jan 24 00:54:54.897638 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:54:54.897666 kernel: audit: type=1130 audit(1769216094.893:1458): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-65.109.167.77:22-4.153.228.146:33332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:54.893000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-65.109.167.77:22-4.153.228.146:33332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:54:55.558844 sshd[7053]: Accepted publickey for core from 4.153.228.146 port 33332 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:54:55.557000 audit[7053]: USER_ACCT pid=7053 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:55.567000 audit[7053]: CRED_ACQ pid=7053 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:55.574333 kernel: audit: type=1101 audit(1769216095.557:1459): pid=7053 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:55.574396 kernel: audit: type=1103 audit(1769216095.567:1460): pid=7053 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:55.573951 sshd-session[7053]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:54:55.584566 kernel: audit: type=1006 audit(1769216095.567:1461): pid=7053 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=88 res=1 Jan 24 00:54:55.567000 audit[7053]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffc8112eb0 a2=3 a3=0 items=0 ppid=1 pid=7053 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:55.598103 kernel: audit: type=1300 audit(1769216095.567:1461): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffc8112eb0 a2=3 a3=0 items=0 ppid=1 pid=7053 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:54:55.598495 systemd-logind[1656]: New session 88 of user core. Jan 24 00:54:55.567000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:55.603206 kernel: audit: type=1327 audit(1769216095.567:1461): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:54:55.604736 systemd[1]: Started session-88.scope - Session 88 of User core. 
Jan 24 00:54:55.610000 audit[7053]: USER_START pid=7053 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:55.619164 kernel: audit: type=1105 audit(1769216095.610:1462): pid=7053 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:55.620000 audit[7059]: CRED_ACQ pid=7059 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:55.629190 kernel: audit: type=1103 audit(1769216095.620:1463): pid=7059 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:56.007099 sshd[7059]: Connection closed by 4.153.228.146 port 33332 Jan 24 00:54:56.009229 sshd-session[7053]: pam_unix(sshd:session): session closed for user core Jan 24 00:54:56.009000 audit[7053]: USER_END pid=7053 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:56.019086 kernel: audit: type=1106 audit(1769216096.009:1464): pid=7053 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:56.009000 audit[7053]: CRED_DISP pid=7053 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:56.021764 systemd[1]: sshd@87-65.109.167.77:22-4.153.228.146:33332.service: Deactivated successfully. Jan 24 00:54:56.021000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-65.109.167.77:22-4.153.228.146:33332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:54:56.025101 kernel: audit: type=1104 audit(1769216096.009:1465): pid=7053 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:54:56.025539 systemd[1]: session-88.scope: Deactivated successfully. Jan 24 00:54:56.027056 systemd-logind[1656]: Session 88 logged out. Waiting for processes to exit. Jan 24 00:54:56.028672 systemd-logind[1656]: Removed session 88. 
Jan 24 00:54:59.617171 kubelet[2863]: E0124 00:54:59.616968 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:54:59.618558 kubelet[2863]: E0124 00:54:59.618204 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:54:59.620078 kubelet[2863]: E0124 00:54:59.619989 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:55:00.617895 kubelet[2863]: E0124 00:55:00.617828 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:55:01.147359 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:01.147463 kernel: audit: type=1130 audit(1769216101.138:1467): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-65.109.167.77:22-4.153.228.146:33344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:01.138000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-65.109.167.77:22-4.153.228.146:33344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:01.139465 systemd[1]: Started sshd@88-65.109.167.77:22-4.153.228.146:33344.service - OpenSSH per-connection server daemon (4.153.228.146:33344). 
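The kubelet records above all repeat one pattern: pod_workers.go logs "Error syncing pod, skipping" with an ImagePullBackOff because a ghcr.io/flatcar/calico image tag cannot be resolved. A small stand-alone sketch (illustrative only, not Flatcar or Kubernetes tooling) that pulls the back-off image reference(s) and the affected pod out of one such journal line, assuming the escaping shown above:

package main

import (
	"fmt"
	"regexp"
)

var (
	// The image reference sits inside doubly escaped quotes in these records.
	imageRE = regexp.MustCompile(`Back-off pulling image \\+"([^"\\]+)\\+"`)
	podRE   = regexp.MustCompile(`pod="([^"]+)"`)
)

func main() {
	// Abbreviated copy of one kubelet record from this log.
	line := `E0124 00:54:44.616746 2863 pod_workers.go:1301] "Error syncing pod, skipping" ` +
		`err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: ` +
		`\"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\"\"" ` +
		`pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb"`

	// FindAll handles records that list more than one image (e.g. whisker and
	// whisker-backend above).
	for _, m := range imageRE.FindAllStringSubmatch(line, -1) {
		fmt.Println("image:", m[1])
	}
	if m := podRE.FindStringSubmatch(line); m != nil {
		fmt.Println("pod:", m[1])
	}
}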
Jan 24 00:55:01.616707 kubelet[2863]: E0124 00:55:01.616629 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:55:01.804000 audit[7075]: USER_ACCT pid=7075 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:01.808284 sshd[7075]: Accepted publickey for core from 4.153.228.146 port 33344 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:01.810746 sshd-session[7075]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:01.821147 kernel: audit: type=1101 audit(1769216101.804:1468): pid=7075 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:01.807000 audit[7075]: CRED_ACQ pid=7075 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:01.833110 systemd-logind[1656]: New session 89 of user core. Jan 24 00:55:01.842528 kernel: audit: type=1103 audit(1769216101.807:1469): pid=7075 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:01.842730 kernel: audit: type=1006 audit(1769216101.808:1470): pid=7075 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=89 res=1 Jan 24 00:55:01.808000 audit[7075]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcc3cf0090 a2=3 a3=0 items=0 ppid=1 pid=7075 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:01.849370 systemd[1]: Started session-89.scope - Session 89 of User core. 
Jan 24 00:55:01.853357 kernel: audit: type=1300 audit(1769216101.808:1470): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcc3cf0090 a2=3 a3=0 items=0 ppid=1 pid=7075 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:01.808000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:01.863220 kernel: audit: type=1327 audit(1769216101.808:1470): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:01.857000 audit[7075]: USER_START pid=7075 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:01.869795 kernel: audit: type=1105 audit(1769216101.857:1471): pid=7075 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:01.861000 audit[7079]: CRED_ACQ pid=7079 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:01.880305 kernel: audit: type=1103 audit(1769216101.861:1472): pid=7079 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:02.276613 sshd[7079]: Connection closed by 4.153.228.146 port 33344 Jan 24 00:55:02.278029 sshd-session[7075]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:02.278000 audit[7075]: USER_END pid=7075 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:02.282459 systemd[1]: sshd@88-65.109.167.77:22-4.153.228.146:33344.service: Deactivated successfully. Jan 24 00:55:02.284493 systemd[1]: session-89.scope: Deactivated successfully. Jan 24 00:55:02.287138 systemd-logind[1656]: Session 89 logged out. Waiting for processes to exit. Jan 24 00:55:02.287970 systemd-logind[1656]: Removed session 89. 
Jan 24 00:55:02.278000 audit[7075]: CRED_DISP pid=7075 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:02.299431 kernel: audit: type=1106 audit(1769216102.278:1473): pid=7075 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:02.299526 kernel: audit: type=1104 audit(1769216102.278:1474): pid=7075 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:02.278000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-65.109.167.77:22-4.153.228.146:33344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:06.616904 kubelet[2863]: E0124 00:55:06.616812 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:55:07.423443 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:07.423555 kernel: audit: type=1130 audit(1769216107.419:1476): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-65.109.167.77:22-4.153.228.146:59714 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:07.419000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-65.109.167.77:22-4.153.228.146:59714 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:07.420518 systemd[1]: Started sshd@89-65.109.167.77:22-4.153.228.146:59714.service - OpenSSH per-connection server daemon (4.153.228.146:59714). 
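The per-connection units in this log are named like sshd@89-65.109.167.77:22-4.153.228.146:59714.service. Judging from the surrounding records, the instance part appears to encode a connection counter, the local address:port and the remote address:port; that reading is an assumption drawn from this log, not taken from systemd documentation. A sketch that splits one such unit name:

package main

import (
	"fmt"
	"regexp"
)

// Assumed layout: sshd@<counter>-<local addr:port>-<remote addr:port>.service
var unitRE = regexp.MustCompile(`^sshd@(\d+)-(.+:\d+)-(.+:\d+)\.service$`)

func main() {
	m := unitRE.FindStringSubmatch("sshd@89-65.109.167.77:22-4.153.228.146:59714.service")
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Println("counter:", m[1]) // 89
	fmt.Println("local:  ", m[2]) // 65.109.167.77:22
	fmt.Println("remote: ", m[3]) // 4.153.228.146:59714
}

The remote address:port in the unit name matches the "Connection closed by 4.153.228.146 port 59714" record for the same session, which is what the field interpretation above rests on.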
Jan 24 00:55:08.088000 audit[7116]: USER_ACCT pid=7116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.090231 sshd[7116]: Accepted publickey for core from 4.153.228.146 port 59714 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:08.090000 audit[7116]: CRED_ACQ pid=7116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.097920 sshd-session[7116]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:08.098974 kernel: audit: type=1101 audit(1769216108.088:1477): pid=7116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.099048 kernel: audit: type=1103 audit(1769216108.090:1478): pid=7116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.105564 kernel: audit: type=1006 audit(1769216108.090:1479): pid=7116 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=90 res=1 Jan 24 00:55:08.090000 audit[7116]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdc65fa1e0 a2=3 a3=0 items=0 ppid=1 pid=7116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:08.111132 kernel: audit: type=1300 audit(1769216108.090:1479): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdc65fa1e0 a2=3 a3=0 items=0 ppid=1 pid=7116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:08.116474 kernel: audit: type=1327 audit(1769216108.090:1479): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:08.090000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:08.112717 systemd-logind[1656]: New session 90 of user core. Jan 24 00:55:08.119388 systemd[1]: Started session-90.scope - Session 90 of User core. 
Jan 24 00:55:08.126000 audit[7116]: USER_START pid=7116 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.136547 kernel: audit: type=1105 audit(1769216108.126:1480): pid=7116 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.136687 kernel: audit: type=1103 audit(1769216108.130:1481): pid=7120 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.130000 audit[7120]: CRED_ACQ pid=7120 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.590122 sshd[7120]: Connection closed by 4.153.228.146 port 59714 Jan 24 00:55:08.591275 sshd-session[7116]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:08.597000 audit[7116]: USER_END pid=7116 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.617291 kernel: audit: type=1106 audit(1769216108.597:1482): pid=7116 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.617488 kernel: audit: type=1104 audit(1769216108.597:1483): pid=7116 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.597000 audit[7116]: CRED_DISP pid=7116 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:08.633191 systemd[1]: sshd@89-65.109.167.77:22-4.153.228.146:59714.service: Deactivated successfully. Jan 24 00:55:08.636330 systemd[1]: session-90.scope: Deactivated successfully. Jan 24 00:55:08.641758 systemd-logind[1656]: Session 90 logged out. Waiting for processes to exit. Jan 24 00:55:08.643656 systemd-logind[1656]: Removed session 90. Jan 24 00:55:08.632000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-65.109.167.77:22-4.153.228.146:59714 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:55:10.618404 kubelet[2863]: E0124 00:55:10.618154 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:55:11.620790 kubelet[2863]: E0124 00:55:11.620722 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:55:12.616452 kubelet[2863]: E0124 00:55:12.616400 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:55:13.748730 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:13.748874 kernel: audit: type=1130 audit(1769216113.731:1485): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-65.109.167.77:22-4.153.228.146:59716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:13.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-65.109.167.77:22-4.153.228.146:59716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:13.731549 systemd[1]: Started sshd@90-65.109.167.77:22-4.153.228.146:59716.service - OpenSSH per-connection server daemon (4.153.228.146:59716). 
Jan 24 00:55:14.408000 audit[7132]: USER_ACCT pid=7132 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.414396 sshd-session[7132]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:14.415870 sshd[7132]: Accepted publickey for core from 4.153.228.146 port 59716 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:14.425984 kernel: audit: type=1101 audit(1769216114.408:1486): pid=7132 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.426165 kernel: audit: type=1103 audit(1769216114.408:1487): pid=7132 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.408000 audit[7132]: CRED_ACQ pid=7132 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.447174 kernel: audit: type=1006 audit(1769216114.408:1488): pid=7132 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=91 res=1 Jan 24 00:55:14.408000 audit[7132]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc0350a5d0 a2=3 a3=0 items=0 ppid=1 pid=7132 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:14.452510 systemd-logind[1656]: New session 91 of user core. Jan 24 00:55:14.408000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:14.464175 kernel: audit: type=1300 audit(1769216114.408:1488): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc0350a5d0 a2=3 a3=0 items=0 ppid=1 pid=7132 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:14.464269 kernel: audit: type=1327 audit(1769216114.408:1488): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:14.469834 systemd[1]: Started session-91.scope - Session 91 of User core. 
Jan 24 00:55:14.480000 audit[7132]: USER_START pid=7132 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.495000 audit[7136]: CRED_ACQ pid=7136 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.501628 kernel: audit: type=1105 audit(1769216114.480:1489): pid=7132 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.501715 kernel: audit: type=1103 audit(1769216114.495:1490): pid=7136 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.617927 kubelet[2863]: E0124 00:55:14.617837 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:55:14.904001 sshd[7136]: Connection closed by 4.153.228.146 port 59716 Jan 24 00:55:14.902996 sshd-session[7132]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:14.904000 audit[7132]: USER_END pid=7132 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.916775 systemd[1]: sshd@90-65.109.167.77:22-4.153.228.146:59716.service: Deactivated successfully. Jan 24 00:55:14.920860 systemd-logind[1656]: Session 91 logged out. Waiting for processes to exit. 
Jan 24 00:55:14.924006 kernel: audit: type=1106 audit(1769216114.904:1491): pid=7132 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.924055 kernel: audit: type=1104 audit(1769216114.904:1492): pid=7132 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.904000 audit[7132]: CRED_DISP pid=7132 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:14.925298 systemd[1]: session-91.scope: Deactivated successfully. Jan 24 00:55:14.935156 systemd-logind[1656]: Removed session 91. Jan 24 00:55:14.916000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-65.109.167.77:22-4.153.228.146:59716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:15.617449 kubelet[2863]: E0124 00:55:15.617409 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:55:20.040243 systemd[1]: Started sshd@91-65.109.167.77:22-4.153.228.146:50336.service - OpenSSH per-connection server daemon (4.153.228.146:50336). Jan 24 00:55:20.059035 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:20.059275 kernel: audit: type=1130 audit(1769216120.039:1494): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-65.109.167.77:22-4.153.228.146:50336 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:20.039000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-65.109.167.77:22-4.153.228.146:50336 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:55:20.616530 kubelet[2863]: E0124 00:55:20.616419 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:55:20.768163 kernel: audit: type=1101 audit(1769216120.751:1495): pid=7148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:20.751000 audit[7148]: USER_ACCT pid=7148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:20.758120 sshd-session[7148]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:20.769041 sshd[7148]: Accepted publickey for core from 4.153.228.146 port 50336 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:20.755000 audit[7148]: CRED_ACQ pid=7148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:20.778095 kernel: audit: type=1103 audit(1769216120.755:1496): pid=7148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:20.784566 systemd-logind[1656]: New session 92 of user core. Jan 24 00:55:20.790084 kernel: audit: type=1006 audit(1769216120.755:1497): pid=7148 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=92 res=1 Jan 24 00:55:20.791205 systemd[1]: Started session-92.scope - Session 92 of User core. 
Jan 24 00:55:20.755000 audit[7148]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc9a2dc2b0 a2=3 a3=0 items=0 ppid=1 pid=7148 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:20.801086 kernel: audit: type=1300 audit(1769216120.755:1497): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc9a2dc2b0 a2=3 a3=0 items=0 ppid=1 pid=7148 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:20.755000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:20.801000 audit[7148]: USER_START pid=7148 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:20.813186 kernel: audit: type=1327 audit(1769216120.755:1497): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:20.813238 kernel: audit: type=1105 audit(1769216120.801:1498): pid=7148 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:20.802000 audit[7152]: CRED_ACQ pid=7152 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:20.828083 kernel: audit: type=1103 audit(1769216120.802:1499): pid=7152 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:21.240634 sshd[7152]: Connection closed by 4.153.228.146 port 50336 Jan 24 00:55:21.241880 sshd-session[7148]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:21.244000 audit[7148]: USER_END pid=7148 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:21.251727 systemd[1]: sshd@91-65.109.167.77:22-4.153.228.146:50336.service: Deactivated successfully. Jan 24 00:55:21.262724 kernel: audit: type=1106 audit(1769216121.244:1500): pid=7148 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:21.256105 systemd[1]: session-92.scope: Deactivated successfully. Jan 24 00:55:21.262183 systemd-logind[1656]: Session 92 logged out. Waiting for processes to exit. 
Jan 24 00:55:21.265648 systemd-logind[1656]: Removed session 92. Jan 24 00:55:21.244000 audit[7148]: CRED_DISP pid=7148 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:21.248000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-65.109.167.77:22-4.153.228.146:50336 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:21.280126 kernel: audit: type=1104 audit(1769216121.244:1501): pid=7148 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:21.620736 kubelet[2863]: E0124 00:55:21.620292 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:55:25.617016 kubelet[2863]: E0124 00:55:25.616738 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:55:25.618106 kubelet[2863]: E0124 00:55:25.617864 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:55:26.387702 systemd[1]: Started sshd@92-65.109.167.77:22-4.153.228.146:57076.service - OpenSSH per-connection server daemon (4.153.228.146:57076). 
Jan 24 00:55:26.390799 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:26.390973 kernel: audit: type=1130 audit(1769216126.386:1503): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-65.109.167.77:22-4.153.228.146:57076 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:26.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-65.109.167.77:22-4.153.228.146:57076 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:27.108000 audit[7166]: USER_ACCT pid=7166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.108000 audit[7166]: CRED_ACQ pid=7166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.115485 sshd-session[7166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:27.126121 sshd[7166]: Accepted publickey for core from 4.153.228.146 port 57076 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:27.128267 kernel: audit: type=1101 audit(1769216127.108:1504): pid=7166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.128404 kernel: audit: type=1103 audit(1769216127.108:1505): pid=7166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.151881 kernel: audit: type=1006 audit(1769216127.108:1506): pid=7166 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=93 res=1 Jan 24 00:55:27.152012 kernel: audit: type=1300 audit(1769216127.108:1506): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd24bab9b0 a2=3 a3=0 items=0 ppid=1 pid=7166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:27.108000 audit[7166]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd24bab9b0 a2=3 a3=0 items=0 ppid=1 pid=7166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:27.149192 systemd-logind[1656]: New session 93 of user core. Jan 24 00:55:27.108000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:27.156150 kernel: audit: type=1327 audit(1769216127.108:1506): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:27.160413 systemd[1]: Started session-93.scope - Session 93 of User core. 
Jan 24 00:55:27.169000 audit[7166]: USER_START pid=7166 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.178150 kernel: audit: type=1105 audit(1769216127.169:1507): pid=7166 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.176000 audit[7170]: CRED_ACQ pid=7170 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.184119 kernel: audit: type=1103 audit(1769216127.176:1508): pid=7170 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.590921 sshd[7170]: Connection closed by 4.153.228.146 port 57076 Jan 24 00:55:27.592407 sshd-session[7166]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:27.594000 audit[7166]: USER_END pid=7166 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.601820 systemd[1]: sshd@92-65.109.167.77:22-4.153.228.146:57076.service: Deactivated successfully. Jan 24 00:55:27.606143 systemd[1]: session-93.scope: Deactivated successfully. Jan 24 00:55:27.612035 systemd-logind[1656]: Session 93 logged out. Waiting for processes to exit. Jan 24 00:55:27.614568 systemd-logind[1656]: Removed session 93. 
Jan 24 00:55:27.615146 kernel: audit: type=1106 audit(1769216127.594:1509): pid=7166 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.620440 kubelet[2863]: E0124 00:55:27.620385 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:55:27.594000 audit[7166]: CRED_DISP pid=7166 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:27.625176 kubelet[2863]: E0124 00:55:27.625001 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:55:27.601000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-65.109.167.77:22-4.153.228.146:57076 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:55:27.634302 kernel: audit: type=1104 audit(1769216127.594:1510): pid=7166 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:32.616797 kubelet[2863]: E0124 00:55:32.616669 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:55:32.734639 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:32.735218 kernel: audit: type=1130 audit(1769216132.731:1512): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-65.109.167.77:22-4.153.228.146:57092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:32.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-65.109.167.77:22-4.153.228.146:57092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:32.732651 systemd[1]: Started sshd@93-65.109.167.77:22-4.153.228.146:57092.service - OpenSSH per-connection server daemon (4.153.228.146:57092). Jan 24 00:55:33.439735 sshd[7185]: Accepted publickey for core from 4.153.228.146 port 57092 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:33.438000 audit[7185]: USER_ACCT pid=7185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.453138 sshd-session[7185]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:33.457117 kernel: audit: type=1101 audit(1769216133.438:1513): pid=7185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.449000 audit[7185]: CRED_ACQ pid=7185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.471149 kernel: audit: type=1103 audit(1769216133.449:1514): pid=7185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.480711 kernel: audit: type=1006 audit(1769216133.449:1515): pid=7185 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=94 res=1 Jan 24 00:55:33.480797 kernel: audit: type=1300 audit(1769216133.449:1515): arch=c000003e syscall=1 success=yes exit=3 a0=8 
a1=7ffc7f090e10 a2=3 a3=0 items=0 ppid=1 pid=7185 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:33.449000 audit[7185]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc7f090e10 a2=3 a3=0 items=0 ppid=1 pid=7185 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:33.487025 kernel: audit: type=1327 audit(1769216133.449:1515): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:33.449000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:33.482768 systemd-logind[1656]: New session 94 of user core. Jan 24 00:55:33.491266 systemd[1]: Started session-94.scope - Session 94 of User core. Jan 24 00:55:33.498000 audit[7185]: USER_START pid=7185 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.507651 kernel: audit: type=1105 audit(1769216133.498:1516): pid=7185 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.507707 kernel: audit: type=1103 audit(1769216133.506:1517): pid=7214 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.506000 audit[7214]: CRED_ACQ pid=7214 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.622381 kubelet[2863]: E0124 00:55:33.622336 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:55:33.914883 sshd[7214]: Connection closed by 4.153.228.146 port 57092 Jan 24 00:55:33.915718 sshd-session[7185]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:33.917000 audit[7185]: USER_END pid=7185 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.936774 kernel: audit: type=1106 audit(1769216133.917:1518): 
pid=7185 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.926199 systemd[1]: sshd@93-65.109.167.77:22-4.153.228.146:57092.service: Deactivated successfully. Jan 24 00:55:33.950516 kernel: audit: type=1104 audit(1769216133.918:1519): pid=7185 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.918000 audit[7185]: CRED_DISP pid=7185 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:33.938501 systemd[1]: session-94.scope: Deactivated successfully. Jan 24 00:55:33.945652 systemd-logind[1656]: Session 94 logged out. Waiting for processes to exit. Jan 24 00:55:33.927000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-65.109.167.77:22-4.153.228.146:57092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:33.952821 systemd-logind[1656]: Removed session 94. Jan 24 00:55:38.616842 kubelet[2863]: E0124 00:55:38.616752 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:55:39.074506 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:39.074687 kernel: audit: type=1130 audit(1769216139.057:1521): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-65.109.167.77:22-4.153.228.146:46180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:39.057000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-65.109.167.77:22-4.153.228.146:46180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:39.058691 systemd[1]: Started sshd@94-65.109.167.77:22-4.153.228.146:46180.service - OpenSSH per-connection server daemon (4.153.228.146:46180). 
Jan 24 00:55:39.617135 kubelet[2863]: E0124 00:55:39.617086 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:55:39.757000 audit[7226]: USER_ACCT pid=7226 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:39.760480 sshd-session[7226]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:39.763718 sshd[7226]: Accepted publickey for core from 4.153.228.146 port 46180 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:39.765709 kernel: audit: type=1101 audit(1769216139.757:1522): pid=7226 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:39.765770 kernel: audit: type=1103 audit(1769216139.758:1523): pid=7226 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:39.758000 audit[7226]: CRED_ACQ pid=7226 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:39.770107 systemd-logind[1656]: New session 95 of user core. Jan 24 00:55:39.775509 systemd[1]: Started session-95.scope - Session 95 of User core. 
Jan 24 00:55:39.777100 kernel: audit: type=1006 audit(1769216139.758:1524): pid=7226 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=95 res=1 Jan 24 00:55:39.758000 audit[7226]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcdb7af0b0 a2=3 a3=0 items=0 ppid=1 pid=7226 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:39.784537 kernel: audit: type=1300 audit(1769216139.758:1524): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcdb7af0b0 a2=3 a3=0 items=0 ppid=1 pid=7226 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:39.784593 kernel: audit: type=1327 audit(1769216139.758:1524): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:39.758000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:39.784000 audit[7226]: USER_START pid=7226 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:39.789170 kernel: audit: type=1105 audit(1769216139.784:1525): pid=7226 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:39.788000 audit[7230]: CRED_ACQ pid=7230 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:39.802086 kernel: audit: type=1103 audit(1769216139.788:1526): pid=7230 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:40.239438 sshd[7230]: Connection closed by 4.153.228.146 port 46180 Jan 24 00:55:40.241422 sshd-session[7226]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:40.263198 kernel: audit: type=1106 audit(1769216140.244:1527): pid=7226 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:40.244000 audit[7226]: USER_END pid=7226 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:40.254284 systemd[1]: sshd@94-65.109.167.77:22-4.153.228.146:46180.service: 
Deactivated successfully. Jan 24 00:55:40.259811 systemd[1]: session-95.scope: Deactivated successfully. Jan 24 00:55:40.244000 audit[7226]: CRED_DISP pid=7226 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:40.273613 systemd-logind[1656]: Session 95 logged out. Waiting for processes to exit. Jan 24 00:55:40.253000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-65.109.167.77:22-4.153.228.146:46180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:40.282153 kernel: audit: type=1104 audit(1769216140.244:1528): pid=7226 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:40.283520 systemd-logind[1656]: Removed session 95. Jan 24 00:55:40.615910 kubelet[2863]: E0124 00:55:40.615846 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:55:42.618127 kubelet[2863]: E0124 00:55:42.617982 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:55:45.397324 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:45.397484 kernel: audit: type=1130 audit(1769216145.378:1530): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-65.109.167.77:22-4.153.228.146:46798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:45.378000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-65.109.167.77:22-4.153.228.146:46798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:45.379544 systemd[1]: Started sshd@95-65.109.167.77:22-4.153.228.146:46798.service - OpenSSH per-connection server daemon (4.153.228.146:46798). 
Jan 24 00:55:46.084000 audit[7257]: USER_ACCT pid=7257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.089422 sshd-session[7257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:46.090500 sshd[7257]: Accepted publickey for core from 4.153.228.146 port 46798 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:46.092597 kernel: audit: type=1101 audit(1769216146.084:1531): pid=7257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.087000 audit[7257]: CRED_ACQ pid=7257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.102200 kernel: audit: type=1103 audit(1769216146.087:1532): pid=7257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.102271 kernel: audit: type=1006 audit(1769216146.087:1533): pid=7257 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=96 res=1 Jan 24 00:55:46.105864 systemd-logind[1656]: New session 96 of user core. Jan 24 00:55:46.087000 audit[7257]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff800f3180 a2=3 a3=0 items=0 ppid=1 pid=7257 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:46.115764 kernel: audit: type=1300 audit(1769216146.087:1533): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff800f3180 a2=3 a3=0 items=0 ppid=1 pid=7257 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:46.115848 kernel: audit: type=1327 audit(1769216146.087:1533): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:46.087000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:46.116084 systemd[1]: Started session-96.scope - Session 96 of User core. 
Jan 24 00:55:46.119000 audit[7257]: USER_START pid=7257 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.128163 kernel: audit: type=1105 audit(1769216146.119:1534): pid=7257 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.127000 audit[7261]: CRED_ACQ pid=7261 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.138107 kernel: audit: type=1103 audit(1769216146.127:1535): pid=7261 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.577113 sshd[7261]: Connection closed by 4.153.228.146 port 46798 Jan 24 00:55:46.577868 sshd-session[7257]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:46.579000 audit[7257]: USER_END pid=7257 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.595280 kernel: audit: type=1106 audit(1769216146.579:1536): pid=7257 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.608836 kernel: audit: type=1104 audit(1769216146.579:1537): pid=7257 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.579000 audit[7257]: CRED_DISP pid=7257 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:46.598263 systemd[1]: sshd@95-65.109.167.77:22-4.153.228.146:46798.service: Deactivated successfully. Jan 24 00:55:46.604736 systemd[1]: session-96.scope: Deactivated successfully. Jan 24 00:55:46.596000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-65.109.167.77:22-4.153.228.146:46798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:46.610629 systemd-logind[1656]: Session 96 logged out. Waiting for processes to exit. Jan 24 00:55:46.614816 systemd-logind[1656]: Removed session 96. 
Jan 24 00:55:47.617285 kubelet[2863]: E0124 00:55:47.617250 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:55:48.615617 kubelet[2863]: E0124 00:55:48.615556 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:55:51.617856 kubelet[2863]: E0124 00:55:51.617472 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:55:51.727670 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:51.727827 kernel: audit: type=1130 audit(1769216151.715:1539): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-65.109.167.77:22-4.153.228.146:46802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:51.715000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-65.109.167.77:22-4.153.228.146:46802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:51.716577 systemd[1]: Started sshd@96-65.109.167.77:22-4.153.228.146:46802.service - OpenSSH per-connection server daemon (4.153.228.146:46802). 
Jan 24 00:55:52.406000 audit[7275]: USER_ACCT pid=7275 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.416587 kernel: audit: type=1101 audit(1769216152.406:1540): pid=7275 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.416436 sshd-session[7275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:52.416917 sshd[7275]: Accepted publickey for core from 4.153.228.146 port 46802 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:52.414000 audit[7275]: CRED_ACQ pid=7275 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.426103 kernel: audit: type=1103 audit(1769216152.414:1541): pid=7275 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.429503 systemd-logind[1656]: New session 97 of user core. Jan 24 00:55:52.434095 kernel: audit: type=1006 audit(1769216152.414:1542): pid=7275 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=97 res=1 Jan 24 00:55:52.414000 audit[7275]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd559b50a0 a2=3 a3=0 items=0 ppid=1 pid=7275 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=97 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:52.443094 kernel: audit: type=1300 audit(1769216152.414:1542): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd559b50a0 a2=3 a3=0 items=0 ppid=1 pid=7275 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=97 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:52.444239 systemd[1]: Started session-97.scope - Session 97 of User core. 
Jan 24 00:55:52.414000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:52.450093 kernel: audit: type=1327 audit(1769216152.414:1542): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:52.450000 audit[7275]: USER_START pid=7275 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.463100 kernel: audit: type=1105 audit(1769216152.450:1543): pid=7275 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.454000 audit[7279]: CRED_ACQ pid=7279 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.473146 kernel: audit: type=1103 audit(1769216152.454:1544): pid=7279 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.860107 sshd[7279]: Connection closed by 4.153.228.146 port 46802 Jan 24 00:55:52.858872 sshd-session[7275]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:52.880337 kernel: audit: type=1106 audit(1769216152.861:1545): pid=7275 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.861000 audit[7275]: USER_END pid=7275 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.870744 systemd-logind[1656]: Session 97 logged out. Waiting for processes to exit. Jan 24 00:55:52.872945 systemd[1]: sshd@96-65.109.167.77:22-4.153.228.146:46802.service: Deactivated successfully. Jan 24 00:55:52.861000 audit[7275]: CRED_DISP pid=7275 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.884225 systemd[1]: session-97.scope: Deactivated successfully. Jan 24 00:55:52.895460 systemd-logind[1656]: Removed session 97. 
Jan 24 00:55:52.898747 kernel: audit: type=1104 audit(1769216152.861:1546): pid=7275 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:52.871000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-65.109.167.77:22-4.153.228.146:46802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:53.619091 kubelet[2863]: E0124 00:55:53.618358 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:55:53.620596 kubelet[2863]: E0124 00:55:53.620528 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:55:55.620552 kubelet[2863]: E0124 00:55:55.620498 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:55:57.995000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-65.109.167.77:22-4.153.228.146:56262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:57.996735 systemd[1]: Started sshd@97-65.109.167.77:22-4.153.228.146:56262.service - OpenSSH per-connection server daemon (4.153.228.146:56262). Jan 24 00:55:57.998395 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:55:57.999210 kernel: audit: type=1130 audit(1769216157.995:1548): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-65.109.167.77:22-4.153.228.146:56262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:55:58.688379 sshd[7299]: Accepted publickey for core from 4.153.228.146 port 56262 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:55:58.687000 audit[7299]: USER_ACCT pid=7299 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:58.696347 kernel: audit: type=1101 audit(1769216158.687:1549): pid=7299 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:58.696118 sshd-session[7299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:55:58.690000 audit[7299]: CRED_ACQ pid=7299 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:58.704100 kernel: audit: type=1103 audit(1769216158.690:1550): pid=7299 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:58.704619 kernel: audit: type=1006 audit(1769216158.690:1551): pid=7299 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=98 res=1 Jan 24 00:55:58.690000 audit[7299]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdcef348d0 a2=3 a3=0 items=0 ppid=1 pid=7299 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=98 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:58.714292 systemd-logind[1656]: New session 98 of user core. Jan 24 00:55:58.719197 kernel: audit: type=1300 audit(1769216158.690:1551): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdcef348d0 a2=3 a3=0 items=0 ppid=1 pid=7299 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=98 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:55:58.690000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:58.723191 systemd[1]: Started session-98.scope - Session 98 of User core. 
Jan 24 00:55:58.725588 kernel: audit: type=1327 audit(1769216158.690:1551): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:55:58.726000 audit[7299]: USER_START pid=7299 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:58.735078 kernel: audit: type=1105 audit(1769216158.726:1552): pid=7299 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:58.734000 audit[7303]: CRED_ACQ pid=7303 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:58.744075 kernel: audit: type=1103 audit(1769216158.734:1553): pid=7303 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:59.112596 sshd[7303]: Connection closed by 4.153.228.146 port 56262 Jan 24 00:55:59.113938 sshd-session[7299]: pam_unix(sshd:session): session closed for user core Jan 24 00:55:59.115000 audit[7299]: USER_END pid=7299 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:59.125942 kernel: audit: type=1106 audit(1769216159.115:1554): pid=7299 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:59.124028 systemd-logind[1656]: Session 98 logged out. Waiting for processes to exit. Jan 24 00:55:59.115000 audit[7299]: CRED_DISP pid=7299 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:59.127620 systemd[1]: sshd@97-65.109.167.77:22-4.153.228.146:56262.service: Deactivated successfully. Jan 24 00:55:59.132525 systemd[1]: session-98.scope: Deactivated successfully. 
Jan 24 00:55:59.134208 kernel: audit: type=1104 audit(1769216159.115:1555): pid=7299 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:55:59.126000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-65.109.167.77:22-4.153.228.146:56262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:55:59.137610 systemd-logind[1656]: Removed session 98. Jan 24 00:56:00.618770 kubelet[2863]: E0124 00:56:00.618716 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:56:02.616737 kubelet[2863]: E0124 00:56:02.616636 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:56:03.620102 kubelet[2863]: E0124 00:56:03.619357 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:56:04.258335 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:04.258448 kernel: audit: type=1130 audit(1769216164.253:1557): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-65.109.167.77:22-4.153.228.146:56278 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:04.253000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-65.109.167.77:22-4.153.228.146:56278 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:04.254014 systemd[1]: Started sshd@98-65.109.167.77:22-4.153.228.146:56278.service - OpenSSH per-connection server daemon (4.153.228.146:56278). 
Jan 24 00:56:04.940109 kernel: audit: type=1101 audit(1769216164.932:1558): pid=7341 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:04.940252 kernel: audit: type=1103 audit(1769216164.934:1559): pid=7341 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:04.932000 audit[7341]: USER_ACCT pid=7341 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:04.934000 audit[7341]: CRED_ACQ pid=7341 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:04.938155 sshd-session[7341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:04.942348 sshd[7341]: Accepted publickey for core from 4.153.228.146 port 56278 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:04.934000 audit[7341]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff10ba2020 a2=3 a3=0 items=0 ppid=1 pid=7341 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=99 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:04.956178 kernel: audit: type=1006 audit(1769216164.934:1560): pid=7341 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=99 res=1 Jan 24 00:56:04.956282 kernel: audit: type=1300 audit(1769216164.934:1560): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff10ba2020 a2=3 a3=0 items=0 ppid=1 pid=7341 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=99 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:04.934000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:04.964681 kernel: audit: type=1327 audit(1769216164.934:1560): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:04.969086 systemd-logind[1656]: New session 99 of user core. Jan 24 00:56:04.976252 systemd[1]: Started session-99.scope - Session 99 of User core. 
Jan 24 00:56:04.981000 audit[7341]: USER_START pid=7341 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:04.992245 kernel: audit: type=1105 audit(1769216164.981:1561): pid=7341 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:04.991000 audit[7345]: CRED_ACQ pid=7345 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:04.999115 kernel: audit: type=1103 audit(1769216164.991:1562): pid=7345 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:05.351163 sshd[7345]: Connection closed by 4.153.228.146 port 56278 Jan 24 00:56:05.352983 sshd-session[7341]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:05.356774 systemd-logind[1656]: Session 99 logged out. Waiting for processes to exit. Jan 24 00:56:05.352000 audit[7341]: USER_END pid=7341 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:05.364713 systemd[1]: sshd@98-65.109.167.77:22-4.153.228.146:56278.service: Deactivated successfully. Jan 24 00:56:05.365595 kernel: audit: type=1106 audit(1769216165.352:1563): pid=7341 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:05.366885 systemd[1]: session-99.scope: Deactivated successfully. Jan 24 00:56:05.368885 systemd-logind[1656]: Removed session 99. Jan 24 00:56:05.353000 audit[7341]: CRED_DISP pid=7341 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:05.376224 kernel: audit: type=1104 audit(1769216165.353:1564): pid=7341 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:05.363000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-65.109.167.77:22-4.153.228.146:56278 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:05.618878 kubelet[2863]: E0124 00:56:05.617851 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:56:07.617533 kubelet[2863]: E0124 00:56:07.616944 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:56:09.621128 kubelet[2863]: E0124 00:56:09.621036 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:56:10.484599 systemd[1]: Started sshd@99-65.109.167.77:22-4.153.228.146:34026.service - OpenSSH per-connection server daemon (4.153.228.146:34026). Jan 24 00:56:10.489109 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:10.489171 kernel: audit: type=1130 audit(1769216170.484:1566): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-65.109.167.77:22-4.153.228.146:34026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:10.484000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-65.109.167.77:22-4.153.228.146:34026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:11.187419 kernel: audit: type=1101 audit(1769216171.170:1567): pid=7357 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.170000 audit[7357]: USER_ACCT pid=7357 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.187717 sshd[7357]: Accepted publickey for core from 4.153.228.146 port 34026 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:11.189000 audit[7357]: CRED_ACQ pid=7357 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.194440 sshd-session[7357]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:11.206048 kernel: audit: type=1103 audit(1769216171.189:1568): pid=7357 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.232787 kernel: audit: type=1006 audit(1769216171.189:1569): pid=7357 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=100 res=1 Jan 24 00:56:11.233153 kernel: audit: type=1300 audit(1769216171.189:1569): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd7dd5e9a0 a2=3 a3=0 items=0 ppid=1 pid=7357 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=100 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:11.189000 audit[7357]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd7dd5e9a0 a2=3 a3=0 items=0 ppid=1 pid=7357 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=100 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:11.189000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:11.240098 kernel: audit: type=1327 audit(1769216171.189:1569): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:11.244771 systemd-logind[1656]: New session 100 of user core. Jan 24 00:56:11.253338 systemd[1]: Started session-100.scope - Session 100 of User core. 
Jan 24 00:56:11.261000 audit[7357]: USER_START pid=7357 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.265000 audit[7361]: CRED_ACQ pid=7361 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.278772 kernel: audit: type=1105 audit(1769216171.261:1570): pid=7357 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.278893 kernel: audit: type=1103 audit(1769216171.265:1571): pid=7361 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.617094 sshd[7361]: Connection closed by 4.153.228.146 port 34026 Jan 24 00:56:11.620784 sshd-session[7357]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:11.634841 kernel: audit: type=1106 audit(1769216171.624:1572): pid=7357 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.624000 audit[7357]: USER_END pid=7357 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.631783 systemd[1]: sshd@99-65.109.167.77:22-4.153.228.146:34026.service: Deactivated successfully. Jan 24 00:56:11.636722 systemd[1]: session-100.scope: Deactivated successfully. Jan 24 00:56:11.624000 audit[7357]: CRED_DISP pid=7357 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:11.641611 systemd-logind[1656]: Session 100 logged out. Waiting for processes to exit. Jan 24 00:56:11.645009 systemd-logind[1656]: Removed session 100. Jan 24 00:56:11.631000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-65.109.167.77:22-4.153.228.146:34026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:11.648085 kernel: audit: type=1104 audit(1769216171.624:1573): pid=7357 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:13.617746 kubelet[2863]: E0124 00:56:13.617587 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:56:14.616463 kubelet[2863]: E0124 00:56:14.616398 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:56:14.621076 kubelet[2863]: E0124 00:56:14.620127 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:56:16.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-65.109.167.77:22-4.153.228.146:52188 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:16.757597 systemd[1]: Started sshd@100-65.109.167.77:22-4.153.228.146:52188.service - OpenSSH per-connection server daemon (4.153.228.146:52188). Jan 24 00:56:16.760029 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:16.760116 kernel: audit: type=1130 audit(1769216176.756:1575): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-65.109.167.77:22-4.153.228.146:52188 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:17.450000 audit[7373]: USER_ACCT pid=7373 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.456095 sshd[7373]: Accepted publickey for core from 4.153.228.146 port 52188 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:17.459364 kernel: audit: type=1101 audit(1769216177.450:1576): pid=7373 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.458580 sshd-session[7373]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:17.453000 audit[7373]: CRED_ACQ pid=7373 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.468125 kernel: audit: type=1103 audit(1769216177.453:1577): pid=7373 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.475814 kernel: audit: type=1006 audit(1769216177.453:1578): pid=7373 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=101 res=1 Jan 24 00:56:17.475913 kernel: audit: type=1300 audit(1769216177.453:1578): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed4aaea50 a2=3 a3=0 items=0 ppid=1 pid=7373 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=101 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:17.453000 audit[7373]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed4aaea50 a2=3 a3=0 items=0 ppid=1 pid=7373 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=101 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:17.472698 systemd-logind[1656]: New session 101 of user core. Jan 24 00:56:17.453000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:17.484220 systemd[1]: Started session-101.scope - Session 101 of User core. 
Jan 24 00:56:17.487430 kernel: audit: type=1327 audit(1769216177.453:1578): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:17.488000 audit[7373]: USER_START pid=7373 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.497079 kernel: audit: type=1105 audit(1769216177.488:1579): pid=7373 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.495000 audit[7377]: CRED_ACQ pid=7377 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.504091 kernel: audit: type=1103 audit(1769216177.495:1580): pid=7377 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.871727 sshd[7377]: Connection closed by 4.153.228.146 port 52188 Jan 24 00:56:17.873222 sshd-session[7373]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:17.873000 audit[7373]: USER_END pid=7373 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.878649 systemd-logind[1656]: Session 101 logged out. Waiting for processes to exit. Jan 24 00:56:17.879405 systemd[1]: sshd@100-65.109.167.77:22-4.153.228.146:52188.service: Deactivated successfully. Jan 24 00:56:17.882363 kernel: audit: type=1106 audit(1769216177.873:1581): pid=7373 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.883655 systemd[1]: session-101.scope: Deactivated successfully. Jan 24 00:56:17.873000 audit[7373]: CRED_DISP pid=7373 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.889865 systemd-logind[1656]: Removed session 101. 
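Each SSH login above produces the same PAM audit sequence: USER_ACCT and CRED_ACQ while the session id is still unset (ses=4294967295), then USER_START, a second CRED_ACQ, USER_END and CRED_DISP under the assigned ses number, bracketed by systemd SERVICE_START/SERVICE_STOP for the per-connection sshd unit. A small Python sketch (illustrative only) that groups those PAM events by session id:

import re
from collections import defaultdict

PAM_EVENTS = {"USER_ACCT", "CRED_ACQ", "USER_START", "USER_END", "CRED_DISP"}
EVENT_RE = re.compile(r'audit\[\d+\]: ([A-Z_]+) .*?\bses=(\d+)')

def pam_events_by_session(text: str) -> dict:
    """Map each assigned session id to its ordered PAM audit event names."""
    events = defaultdict(list)
    for name, ses in EVENT_RE.findall(text):
        if name not in PAM_EVENTS or ses == "4294967295":  # skip pre-login records
            continue
        events[int(ses)].append(name)
    return dict(events)

# Run over the journal text above, session 99 comes out as
# ['USER_START', 'CRED_ACQ', 'USER_END', 'CRED_DISP'].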
Jan 24 00:56:17.891120 kernel: audit: type=1104 audit(1769216177.873:1582): pid=7373 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:17.875000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-65.109.167.77:22-4.153.228.146:52188 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:20.616451 kubelet[2863]: E0124 00:56:20.616387 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:56:21.617304 kubelet[2863]: E0124 00:56:21.617158 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:56:23.006212 systemd[1]: Started sshd@101-65.109.167.77:22-4.153.228.146:52196.service - OpenSSH per-connection server daemon (4.153.228.146:52196). Jan 24 00:56:23.013564 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:23.013629 kernel: audit: type=1130 audit(1769216183.005:1584): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-65.109.167.77:22-4.153.228.146:52196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:23.005000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-65.109.167.77:22-4.153.228.146:52196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:23.686000 audit[7390]: USER_ACCT pid=7390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:23.688899 sshd[7390]: Accepted publickey for core from 4.153.228.146 port 52196 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:23.693184 sshd-session[7390]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:23.715147 kernel: audit: type=1101 audit(1769216183.686:1585): pid=7390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:23.715253 kernel: audit: type=1103 audit(1769216183.689:1586): pid=7390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:23.689000 audit[7390]: CRED_ACQ pid=7390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:23.711415 systemd-logind[1656]: New session 102 of user core. Jan 24 00:56:23.717451 systemd[1]: Started session-102.scope - Session 102 of User core. Jan 24 00:56:23.720865 kernel: audit: type=1006 audit(1769216183.689:1587): pid=7390 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=102 res=1 Jan 24 00:56:23.689000 audit[7390]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe2a962a20 a2=3 a3=0 items=0 ppid=1 pid=7390 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=102 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:23.742430 kernel: audit: type=1300 audit(1769216183.689:1587): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe2a962a20 a2=3 a3=0 items=0 ppid=1 pid=7390 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=102 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:23.742679 kernel: audit: type=1327 audit(1769216183.689:1587): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:23.689000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:23.731000 audit[7390]: USER_START pid=7390 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:23.779551 kernel: audit: type=1105 audit(1769216183.731:1588): pid=7390 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:23.779684 kernel: audit: type=1103 audit(1769216183.743:1589): pid=7396 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:23.743000 audit[7396]: CRED_ACQ pid=7396 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:24.179895 sshd[7396]: Connection closed by 4.153.228.146 port 52196 Jan 24 00:56:24.182230 sshd-session[7390]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:24.182000 audit[7390]: USER_END pid=7390 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:24.193347 kernel: audit: type=1106 audit(1769216184.182:1590): pid=7390 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:24.182000 audit[7390]: CRED_DISP pid=7390 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:24.193971 systemd[1]: sshd@101-65.109.167.77:22-4.153.228.146:52196.service: Deactivated successfully. Jan 24 00:56:24.198018 systemd[1]: session-102.scope: Deactivated successfully. Jan 24 00:56:24.201118 kernel: audit: type=1104 audit(1769216184.182:1591): pid=7390 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:24.193000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-65.109.167.77:22-4.153.228.146:52196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:24.201312 systemd-logind[1656]: Session 102 logged out. Waiting for processes to exit. Jan 24 00:56:24.203809 systemd-logind[1656]: Removed session 102. 
Jan 24 00:56:24.616816 kubelet[2863]: E0124 00:56:24.616769 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:56:25.625264 kubelet[2863]: E0124 00:56:25.625180 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:56:27.617894 kubelet[2863]: E0124 00:56:27.617712 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:56:28.615814 kubelet[2863]: E0124 00:56:28.615729 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:56:29.311560 systemd[1]: Started sshd@102-65.109.167.77:22-4.153.228.146:45914.service - OpenSSH per-connection server daemon (4.153.228.146:45914). Jan 24 00:56:29.310000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-65.109.167.77:22-4.153.228.146:45914 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:29.319471 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:29.319539 kernel: audit: type=1130 audit(1769216189.310:1593): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-65.109.167.77:22-4.153.228.146:45914 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:29.996000 audit[7408]: USER_ACCT pid=7408 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.002739 sshd-session[7408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:30.006110 sshd[7408]: Accepted publickey for core from 4.153.228.146 port 45914 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:29.996000 audit[7408]: CRED_ACQ pid=7408 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.016604 kernel: audit: type=1101 audit(1769216189.996:1594): pid=7408 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.016691 kernel: audit: type=1103 audit(1769216189.996:1595): pid=7408 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.038126 kernel: audit: type=1006 audit(1769216189.996:1596): pid=7408 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=103 res=1 Jan 24 00:56:30.036420 systemd-logind[1656]: New session 103 of user core. Jan 24 00:56:29.996000 audit[7408]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe93bf5160 a2=3 a3=0 items=0 ppid=1 pid=7408 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=103 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:29.996000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:30.054620 kernel: audit: type=1300 audit(1769216189.996:1596): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe93bf5160 a2=3 a3=0 items=0 ppid=1 pid=7408 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=103 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:30.054810 kernel: audit: type=1327 audit(1769216189.996:1596): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:30.055363 systemd[1]: Started session-103.scope - Session 103 of User core. 
Jan 24 00:56:30.062000 audit[7408]: USER_START pid=7408 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.069000 audit[7414]: CRED_ACQ pid=7414 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.082057 kernel: audit: type=1105 audit(1769216190.062:1597): pid=7408 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.082388 kernel: audit: type=1103 audit(1769216190.069:1598): pid=7414 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.478231 sshd[7414]: Connection closed by 4.153.228.146 port 45914 Jan 24 00:56:30.479057 sshd-session[7408]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:30.481000 audit[7408]: USER_END pid=7408 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.487682 systemd-logind[1656]: Session 103 logged out. Waiting for processes to exit. Jan 24 00:56:30.488479 systemd[1]: sshd@102-65.109.167.77:22-4.153.228.146:45914.service: Deactivated successfully. Jan 24 00:56:30.491288 kernel: audit: type=1106 audit(1769216190.481:1599): pid=7408 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.492931 systemd[1]: session-103.scope: Deactivated successfully. Jan 24 00:56:30.502140 kernel: audit: type=1104 audit(1769216190.481:1600): pid=7408 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.481000 audit[7408]: CRED_DISP pid=7408 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:30.497829 systemd-logind[1656]: Removed session 103. Jan 24 00:56:30.484000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-65.109.167.77:22-4.153.228.146:45914 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:32.617302 kubelet[2863]: E0124 00:56:32.617121 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:56:32.618324 kubelet[2863]: E0124 00:56:32.618045 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:56:35.614276 systemd[1]: Started sshd@103-65.109.167.77:22-4.153.228.146:47944.service - OpenSSH per-connection server daemon (4.153.228.146:47944). Jan 24 00:56:35.622282 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:35.622311 kernel: audit: type=1130 audit(1769216195.613:1602): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-65.109.167.77:22-4.153.228.146:47944 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:35.613000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-65.109.167.77:22-4.153.228.146:47944 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:36.273000 audit[7451]: USER_ACCT pid=7451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.283415 sshd[7451]: Accepted publickey for core from 4.153.228.146 port 47944 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:36.303989 kernel: audit: type=1101 audit(1769216196.273:1603): pid=7451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.304152 kernel: audit: type=1103 audit(1769216196.283:1604): pid=7451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.283000 audit[7451]: CRED_ACQ pid=7451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.292949 sshd-session[7451]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:36.283000 audit[7451]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcdb8f4150 a2=3 a3=0 items=0 ppid=1 pid=7451 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=104 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:36.320608 kernel: audit: type=1006 audit(1769216196.283:1605): pid=7451 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=104 res=1 Jan 24 00:56:36.320745 kernel: audit: type=1300 audit(1769216196.283:1605): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcdb8f4150 a2=3 a3=0 items=0 ppid=1 pid=7451 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=104 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:36.321277 systemd-logind[1656]: New session 104 of user core. Jan 24 00:56:36.283000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:36.332882 kernel: audit: type=1327 audit(1769216196.283:1605): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:36.338354 systemd[1]: Started session-104.scope - Session 104 of User core. 
Jan 24 00:56:36.347000 audit[7451]: USER_START pid=7451 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.364091 kernel: audit: type=1105 audit(1769216196.347:1606): pid=7451 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.365000 audit[7455]: CRED_ACQ pid=7455 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.378135 kernel: audit: type=1103 audit(1769216196.365:1607): pid=7455 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.784294 sshd[7455]: Connection closed by 4.153.228.146 port 47944 Jan 24 00:56:36.787707 sshd-session[7451]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:36.790000 audit[7451]: USER_END pid=7451 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.798166 systemd-logind[1656]: Session 104 logged out. Waiting for processes to exit. Jan 24 00:56:36.801397 systemd[1]: sshd@103-65.109.167.77:22-4.153.228.146:47944.service: Deactivated successfully. Jan 24 00:56:36.810132 systemd[1]: session-104.scope: Deactivated successfully. Jan 24 00:56:36.813139 kernel: audit: type=1106 audit(1769216196.790:1608): pid=7451 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.818395 systemd-logind[1656]: Removed session 104. Jan 24 00:56:36.791000 audit[7451]: CRED_DISP pid=7451 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.832093 kernel: audit: type=1104 audit(1769216196.791:1609): pid=7451 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:36.801000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-65.109.167.77:22-4.153.228.146:47944 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:37.617378 kubelet[2863]: E0124 00:56:37.617092 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:56:39.617028 kubelet[2863]: E0124 00:56:39.616658 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:56:40.617403 kubelet[2863]: E0124 00:56:40.616749 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:56:40.623406 containerd[1682]: time="2026-01-24T00:56:40.623347120Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 24 00:56:41.068108 containerd[1682]: time="2026-01-24T00:56:41.067567458Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:56:41.069520 containerd[1682]: time="2026-01-24T00:56:41.069376547Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 24 00:56:41.069520 containerd[1682]: time="2026-01-24T00:56:41.069473616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 24 00:56:41.069755 kubelet[2863]: E0124 00:56:41.069670 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:56:41.069755 kubelet[2863]: E0124 00:56:41.069756 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 24 00:56:41.072298 kubelet[2863]: E0124 00:56:41.072224 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c13ad1f9d9cd499a81c814dc308eb491,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 24 00:56:41.076310 containerd[1682]: time="2026-01-24T00:56:41.076159844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 24 00:56:41.513907 containerd[1682]: time="2026-01-24T00:56:41.513758165Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:56:41.515199 containerd[1682]: time="2026-01-24T00:56:41.515125644Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 24 00:56:41.515199 containerd[1682]: time="2026-01-24T00:56:41.515185674Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 24 00:56:41.515350 kubelet[2863]: E0124 00:56:41.515314 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:56:41.515419 kubelet[2863]: E0124 00:56:41.515354 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 24 00:56:41.515603 kubelet[2863]: E0124 00:56:41.515453 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlnt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-58c54c478-wd6fd_calico-system(dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 24 00:56:41.516782 kubelet[2863]: E0124 00:56:41.516752 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:56:41.921520 systemd[1]: Started sshd@104-65.109.167.77:22-4.153.228.146:47956.service - OpenSSH per-connection server daemon (4.153.228.146:47956). Jan 24 00:56:41.921000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-65.109.167.77:22-4.153.228.146:47956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:41.928792 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:41.928857 kernel: audit: type=1130 audit(1769216201.921:1611): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-65.109.167.77:22-4.153.228.146:47956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:42.590000 audit[7467]: USER_ACCT pid=7467 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:42.600283 sshd-session[7467]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:42.605981 sshd[7467]: Accepted publickey for core from 4.153.228.146 port 47956 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:42.613546 kernel: audit: type=1101 audit(1769216202.590:1612): pid=7467 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:42.613754 kernel: audit: type=1103 audit(1769216202.590:1613): pid=7467 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:42.590000 audit[7467]: CRED_ACQ pid=7467 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:42.621510 kernel: audit: type=1006 audit(1769216202.595:1614): pid=7467 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=105 res=1 Jan 24 00:56:42.630238 kernel: audit: type=1300 audit(1769216202.595:1614): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe1f79e9b0 a2=3 a3=0 items=0 ppid=1 pid=7467 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=105 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:42.595000 audit[7467]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe1f79e9b0 a2=3 a3=0 items=0 ppid=1 pid=7467 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=105 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:42.641144 systemd-logind[1656]: New session 105 of user core. Jan 24 00:56:42.595000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:42.651182 kernel: audit: type=1327 audit(1769216202.595:1614): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:42.652333 systemd[1]: Started session-105.scope - Session 105 of User core. 
Jan 24 00:56:42.658000 audit[7467]: USER_START pid=7467 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:42.688311 kernel: audit: type=1105 audit(1769216202.658:1615): pid=7467 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:42.688438 kernel: audit: type=1103 audit(1769216202.663:1616): pid=7471 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:42.663000 audit[7471]: CRED_ACQ pid=7471 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:43.085531 sshd[7471]: Connection closed by 4.153.228.146 port 47956 Jan 24 00:56:43.087421 sshd-session[7467]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:43.109134 kernel: audit: type=1106 audit(1769216203.090:1617): pid=7467 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:43.090000 audit[7467]: USER_END pid=7467 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:43.097919 systemd[1]: sshd@104-65.109.167.77:22-4.153.228.146:47956.service: Deactivated successfully. Jan 24 00:56:43.103967 systemd[1]: session-105.scope: Deactivated successfully. Jan 24 00:56:43.108380 systemd-logind[1656]: Session 105 logged out. Waiting for processes to exit. Jan 24 00:56:43.114221 systemd-logind[1656]: Removed session 105. Jan 24 00:56:43.090000 audit[7467]: CRED_DISP pid=7467 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:43.129233 kernel: audit: type=1104 audit(1769216203.090:1618): pid=7467 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:43.095000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-65.109.167.77:22-4.153.228.146:47956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:44.616240 kubelet[2863]: E0124 00:56:44.616154 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:56:47.616314 kubelet[2863]: E0124 00:56:47.616131 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:56:48.218000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-65.109.167.77:22-4.153.228.146:47524 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:48.219284 systemd[1]: Started sshd@105-65.109.167.77:22-4.153.228.146:47524.service - OpenSSH per-connection server daemon (4.153.228.146:47524). Jan 24 00:56:48.220693 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:48.220838 kernel: audit: type=1130 audit(1769216208.218:1620): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-65.109.167.77:22-4.153.228.146:47524 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:48.616794 kubelet[2863]: E0124 00:56:48.616668 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:56:48.868000 audit[7483]: USER_ACCT pid=7483 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:48.871537 sshd-session[7483]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:48.875002 sshd[7483]: Accepted publickey for core from 4.153.228.146 port 47524 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:48.868000 audit[7483]: CRED_ACQ pid=7483 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:48.887972 systemd-logind[1656]: New session 106 of user core. 
Jan 24 00:56:48.890764 kernel: audit: type=1101 audit(1769216208.868:1621): pid=7483 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:48.890798 kernel: audit: type=1103 audit(1769216208.868:1622): pid=7483 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:48.903732 kernel: audit: type=1006 audit(1769216208.868:1623): pid=7483 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=106 res=1 Jan 24 00:56:48.868000 audit[7483]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd69dae10 a2=3 a3=0 items=0 ppid=1 pid=7483 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=106 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:48.915041 kernel: audit: type=1300 audit(1769216208.868:1623): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd69dae10 a2=3 a3=0 items=0 ppid=1 pid=7483 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=106 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:48.915425 systemd[1]: Started session-106.scope - Session 106 of User core. Jan 24 00:56:48.868000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:48.929179 kernel: audit: type=1327 audit(1769216208.868:1623): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:48.918000 audit[7483]: USER_START pid=7483 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:48.937212 kernel: audit: type=1105 audit(1769216208.918:1624): pid=7483 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:48.923000 audit[7487]: CRED_ACQ pid=7487 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:48.950963 kernel: audit: type=1103 audit(1769216208.923:1625): pid=7487 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:49.312294 sshd[7487]: Connection closed by 4.153.228.146 port 47524 Jan 24 00:56:49.314351 sshd-session[7483]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:49.315000 audit[7483]: USER_END pid=7483 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:49.330189 systemd[1]: sshd@105-65.109.167.77:22-4.153.228.146:47524.service: Deactivated successfully. Jan 24 00:56:49.335433 kernel: audit: type=1106 audit(1769216209.315:1626): pid=7483 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:49.315000 audit[7483]: CRED_DISP pid=7483 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:49.338338 systemd[1]: session-106.scope: Deactivated successfully. Jan 24 00:56:49.350223 kernel: audit: type=1104 audit(1769216209.315:1627): pid=7483 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:49.342438 systemd-logind[1656]: Session 106 logged out. Waiting for processes to exit. Jan 24 00:56:49.349325 systemd-logind[1656]: Removed session 106. Jan 24 00:56:49.325000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-65.109.167.77:22-4.153.228.146:47524 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:51.616894 kubelet[2863]: E0124 00:56:51.616716 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:56:53.850000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-65.109.167.77:22-92.118.39.87:46830 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:53.851474 systemd[1]: Started sshd@106-65.109.167.77:22-92.118.39.87:46830.service - OpenSSH per-connection server daemon (92.118.39.87:46830). Jan 24 00:56:53.863445 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:56:53.863561 kernel: audit: type=1130 audit(1769216213.850:1629): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-65.109.167.77:22-92.118.39.87:46830 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:54.075840 sshd[7506]: Invalid user solana from 92.118.39.87 port 46830 Jan 24 00:56:54.123250 sshd[7506]: Connection closed by invalid user solana 92.118.39.87 port 46830 [preauth] Jan 24 00:56:54.122000 audit[7506]: USER_ERR pid=7506 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=92.118.39.87 addr=92.118.39.87 terminal=ssh res=failed' Jan 24 00:56:54.137186 kernel: audit: type=1109 audit(1769216214.122:1630): pid=7506 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=92.118.39.87 addr=92.118.39.87 terminal=ssh res=failed' Jan 24 00:56:54.139514 systemd[1]: sshd@106-65.109.167.77:22-92.118.39.87:46830.service: Deactivated successfully. Jan 24 00:56:54.138000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-65.109.167.77:22-92.118.39.87:46830 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:54.157100 kernel: audit: type=1131 audit(1769216214.138:1631): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-65.109.167.77:22-92.118.39.87:46830 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:54.446000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-65.109.167.77:22-4.153.228.146:47530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:54.447389 systemd[1]: Started sshd@107-65.109.167.77:22-4.153.228.146:47530.service - OpenSSH per-connection server daemon (4.153.228.146:47530). Jan 24 00:56:54.453141 kernel: audit: type=1130 audit(1769216214.446:1632): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-65.109.167.77:22-4.153.228.146:47530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:56:54.616931 containerd[1682]: time="2026-01-24T00:56:54.616899213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 24 00:56:55.050246 containerd[1682]: time="2026-01-24T00:56:55.050151046Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:56:55.052360 containerd[1682]: time="2026-01-24T00:56:55.052318475Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 24 00:56:55.052989 containerd[1682]: time="2026-01-24T00:56:55.052413695Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 24 00:56:55.053034 kubelet[2863]: E0124 00:56:55.052667 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:56:55.053034 kubelet[2863]: E0124 00:56:55.052705 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 24 00:56:55.053034 kubelet[2863]: E0124 00:56:55.052835 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qhldj_calico-system(d0cc0ca8-4b85-478b-b9c2-c61b42d93c89): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 24 00:56:55.054142 kubelet[2863]: E0124 00:56:55.054106 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:56:55.103000 audit[7512]: USER_ACCT pid=7512 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.109269 sshd-session[7512]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:56:55.113947 sshd[7512]: Accepted publickey for core from 4.153.228.146 port 47530 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:56:55.121425 kernel: audit: type=1101 audit(1769216215.103:1633): pid=7512 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.105000 audit[7512]: CRED_ACQ pid=7512 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.129182 systemd-logind[1656]: New session 107 of user core. 
Jan 24 00:56:55.136508 kernel: audit: type=1103 audit(1769216215.105:1634): pid=7512 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.136575 kernel: audit: type=1006 audit(1769216215.105:1635): pid=7512 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=107 res=1 Jan 24 00:56:55.105000 audit[7512]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff7e2136e0 a2=3 a3=0 items=0 ppid=1 pid=7512 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=107 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:55.149774 kernel: audit: type=1300 audit(1769216215.105:1635): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff7e2136e0 a2=3 a3=0 items=0 ppid=1 pid=7512 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=107 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:56:55.150460 systemd[1]: Started session-107.scope - Session 107 of User core. Jan 24 00:56:55.105000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:55.167013 kernel: audit: type=1327 audit(1769216215.105:1635): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:56:55.167158 kernel: audit: type=1105 audit(1769216215.165:1636): pid=7512 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.165000 audit[7512]: USER_START pid=7512 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.176000 audit[7516]: CRED_ACQ pid=7516 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.580307 sshd[7516]: Connection closed by 4.153.228.146 port 47530 Jan 24 00:56:55.581302 sshd-session[7512]: pam_unix(sshd:session): session closed for user core Jan 24 00:56:55.584000 audit[7512]: USER_END pid=7512 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.585000 audit[7512]: CRED_DISP pid=7512 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:56:55.592409 systemd[1]: sshd@107-65.109.167.77:22-4.153.228.146:47530.service: Deactivated successfully. 
Jan 24 00:56:55.591000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-65.109.167.77:22-4.153.228.146:47530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:56:55.597031 systemd[1]: session-107.scope: Deactivated successfully. Jan 24 00:56:55.599444 systemd-logind[1656]: Session 107 logged out. Waiting for processes to exit. Jan 24 00:56:55.601830 systemd-logind[1656]: Removed session 107. Jan 24 00:56:55.624778 kubelet[2863]: E0124 00:56:55.624178 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:56:56.616244 containerd[1682]: time="2026-01-24T00:56:56.616208552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 24 00:56:57.050280 containerd[1682]: time="2026-01-24T00:56:57.050114314Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:56:57.051977 containerd[1682]: time="2026-01-24T00:56:57.051923793Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 24 00:56:57.052147 containerd[1682]: time="2026-01-24T00:56:57.052037883Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 24 00:56:57.052329 kubelet[2863]: E0124 00:56:57.052274 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:56:57.053219 kubelet[2863]: E0124 00:56:57.052339 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 24 00:56:57.053219 kubelet[2863]: E0124 00:56:57.052532 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5fsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-58fdcd774c-w2drb_calico-system(f546e732-cf0b-44c7-9678-ae1cb31a23a4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 24 00:56:57.054089 kubelet[2863]: E0124 00:56:57.054014 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:56:59.629425 containerd[1682]: time="2026-01-24T00:56:59.629186493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:57:00.050552 containerd[1682]: 
time="2026-01-24T00:57:00.050420361Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:57:00.052788 containerd[1682]: time="2026-01-24T00:57:00.052728020Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:57:00.053314 containerd[1682]: time="2026-01-24T00:57:00.052859610Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:57:00.053355 kubelet[2863]: E0124 00:57:00.052974 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:57:00.053355 kubelet[2863]: E0124 00:57:00.053014 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:57:00.053355 kubelet[2863]: E0124 00:57:00.053137 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llmb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-7779cd58b4-jsxfd_calico-apiserver(7fb26181-9fdc-4f96-be2c-85fbaa5f21b7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:57:00.054434 kubelet[2863]: E0124 00:57:00.054379 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:57:00.721215 systemd[1]: Started sshd@108-65.109.167.77:22-4.153.228.146:43834.service - OpenSSH per-connection server daemon (4.153.228.146:43834). Jan 24 00:57:00.720000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-65.109.167.77:22-4.153.228.146:43834 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:00.724897 kernel: kauditd_printk_skb: 4 callbacks suppressed Jan 24 00:57:00.725427 kernel: audit: type=1130 audit(1769216220.720:1641): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-65.109.167.77:22-4.153.228.146:43834 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:01.444000 audit[7530]: USER_ACCT pid=7530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.447869 sshd-session[7530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:01.449574 sshd[7530]: Accepted publickey for core from 4.153.228.146 port 43834 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:01.444000 audit[7530]: CRED_ACQ pid=7530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.453909 kernel: audit: type=1101 audit(1769216221.444:1642): pid=7530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.453967 kernel: audit: type=1103 audit(1769216221.444:1643): pid=7530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.463486 kernel: audit: type=1006 audit(1769216221.444:1644): pid=7530 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=108 res=1 Jan 24 00:57:01.463566 kernel: audit: type=1300 audit(1769216221.444:1644): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd7e55ef0 a2=3 a3=0 items=0 ppid=1 pid=7530 auid=500 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=108 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:01.444000 audit[7530]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd7e55ef0 a2=3 a3=0 items=0 ppid=1 pid=7530 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=108 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:01.466207 systemd-logind[1656]: New session 108 of user core. Jan 24 00:57:01.444000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:01.472807 kernel: audit: type=1327 audit(1769216221.444:1644): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:01.474227 systemd[1]: Started session-108.scope - Session 108 of User core. Jan 24 00:57:01.479000 audit[7530]: USER_START pid=7530 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.483000 audit[7534]: CRED_ACQ pid=7534 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.490872 kernel: audit: type=1105 audit(1769216221.479:1645): pid=7530 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.490932 kernel: audit: type=1103 audit(1769216221.483:1646): pid=7534 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.929287 sshd[7534]: Connection closed by 4.153.228.146 port 43834 Jan 24 00:57:01.930448 sshd-session[7530]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:01.933000 audit[7530]: USER_END pid=7530 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.939254 systemd-logind[1656]: Session 108 logged out. Waiting for processes to exit. Jan 24 00:57:01.943123 systemd[1]: sshd@108-65.109.167.77:22-4.153.228.146:43834.service: Deactivated successfully. Jan 24 00:57:01.948237 systemd[1]: session-108.scope: Deactivated successfully. 
Jan 24 00:57:01.952312 kernel: audit: type=1106 audit(1769216221.933:1647): pid=7530 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.953161 systemd-logind[1656]: Removed session 108. Jan 24 00:57:01.967745 kernel: audit: type=1104 audit(1769216221.933:1648): pid=7530 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.933000 audit[7530]: CRED_DISP pid=7530 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:01.943000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-65.109.167.77:22-4.153.228.146:43834 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:03.617677 containerd[1682]: time="2026-01-24T00:57:03.617594474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 24 00:57:04.064135 containerd[1682]: time="2026-01-24T00:57:04.063951959Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:57:04.065711 containerd[1682]: time="2026-01-24T00:57:04.065621299Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 24 00:57:04.065711 containerd[1682]: time="2026-01-24T00:57:04.065684809Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 24 00:57:04.065963 kubelet[2863]: E0124 00:57:04.065936 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:57:04.067264 kubelet[2863]: E0124 00:57:04.067109 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 24 00:57:04.072269 kubelet[2863]: E0124 00:57:04.072229 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 24 00:57:04.076497 containerd[1682]: time="2026-01-24T00:57:04.076462523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 24 00:57:04.506033 containerd[1682]: time="2026-01-24T00:57:04.505966228Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 00:57:04.507570 containerd[1682]: time="2026-01-24T00:57:04.507465037Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 24 00:57:04.508049 containerd[1682]: time="2026-01-24T00:57:04.507499507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 24 00:57:04.508106 kubelet[2863]: E0124 00:57:04.507787 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:57:04.508106 kubelet[2863]: E0124 00:57:04.507830 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 24 00:57:04.508106 kubelet[2863]: E0124 00:57:04.507963 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-njf24_calico-system(12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 24 00:57:04.509462 kubelet[2863]: E0124 00:57:04.509425 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:57:06.616591 containerd[1682]: time="2026-01-24T00:57:06.616518750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 24 00:57:07.055526 containerd[1682]: time="2026-01-24T00:57:07.054951030Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 24 
00:57:07.057233 containerd[1682]: time="2026-01-24T00:57:07.057055249Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 24 00:57:07.057657 containerd[1682]: time="2026-01-24T00:57:07.057315259Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 24 00:57:07.057823 kubelet[2863]: E0124 00:57:07.057556 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:57:07.057823 kubelet[2863]: E0124 00:57:07.057713 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 24 00:57:07.060002 kubelet[2863]: E0124 00:57:07.057892 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tm5dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7779cd58b4-xp2rb_calico-apiserver(d83d59e3-6296-40d9-bb63-5a69b654ac0c): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 24 00:57:07.060002 kubelet[2863]: E0124 00:57:07.059547 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:57:07.083673 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:07.083902 kernel: audit: type=1130 audit(1769216227.069:1650): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-65.109.167.77:22-4.153.228.146:47666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:07.069000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-65.109.167.77:22-4.153.228.146:47666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:07.070398 systemd[1]: Started sshd@109-65.109.167.77:22-4.153.228.146:47666.service - OpenSSH per-connection server daemon (4.153.228.146:47666). Jan 24 00:57:07.760000 audit[7569]: USER_ACCT pid=7569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:07.763312 sshd[7569]: Accepted publickey for core from 4.153.228.146 port 47666 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:07.768157 kernel: audit: type=1101 audit(1769216227.760:1651): pid=7569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:07.769512 sshd-session[7569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:07.767000 audit[7569]: CRED_ACQ pid=7569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:07.777140 kernel: audit: type=1103 audit(1769216227.767:1652): pid=7569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:07.782082 kernel: audit: type=1006 audit(1769216227.767:1653): pid=7569 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=109 res=1 Jan 24 00:57:07.781980 systemd-logind[1656]: New session 109 of user core. 
Jan 24 00:57:07.767000 audit[7569]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe5f21b60 a2=3 a3=0 items=0 ppid=1 pid=7569 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=109 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:07.767000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:07.791629 kernel: audit: type=1300 audit(1769216227.767:1653): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe5f21b60 a2=3 a3=0 items=0 ppid=1 pid=7569 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=109 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:07.791705 kernel: audit: type=1327 audit(1769216227.767:1653): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:07.792249 systemd[1]: Started session-109.scope - Session 109 of User core. Jan 24 00:57:07.795000 audit[7569]: USER_START pid=7569 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:07.804113 kernel: audit: type=1105 audit(1769216227.795:1654): pid=7569 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:07.803000 audit[7573]: CRED_ACQ pid=7573 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:07.810111 kernel: audit: type=1103 audit(1769216227.803:1655): pid=7573 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:08.277031 sshd[7573]: Connection closed by 4.153.228.146 port 47666 Jan 24 00:57:08.277545 sshd-session[7569]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:08.280000 audit[7569]: USER_END pid=7569 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:08.296781 systemd[1]: sshd@109-65.109.167.77:22-4.153.228.146:47666.service: Deactivated successfully. 
Jan 24 00:57:08.300131 kernel: audit: type=1106 audit(1769216228.280:1656): pid=7569 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:08.280000 audit[7569]: CRED_DISP pid=7569 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:08.305217 systemd[1]: session-109.scope: Deactivated successfully. Jan 24 00:57:08.318188 kernel: audit: type=1104 audit(1769216228.280:1657): pid=7569 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:08.318562 systemd-logind[1656]: Session 109 logged out. Waiting for processes to exit. Jan 24 00:57:08.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-65.109.167.77:22-4.153.228.146:47666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:08.323000 systemd-logind[1656]: Removed session 109. Jan 24 00:57:08.618464 kubelet[2863]: E0124 00:57:08.618388 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:57:10.619054 kubelet[2863]: E0124 00:57:10.617985 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:57:11.618157 kubelet[2863]: E0124 00:57:11.618053 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:57:13.412000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-65.109.167.77:22-4.153.228.146:47668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:13.413305 systemd[1]: Started sshd@110-65.109.167.77:22-4.153.228.146:47668.service - OpenSSH per-connection server daemon (4.153.228.146:47668). Jan 24 00:57:13.414950 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:13.415165 kernel: audit: type=1130 audit(1769216233.412:1659): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-65.109.167.77:22-4.153.228.146:47668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:14.109000 audit[7585]: USER_ACCT pid=7585 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.112458 sshd[7585]: Accepted publickey for core from 4.153.228.146 port 47668 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:14.117239 kernel: audit: type=1101 audit(1769216234.109:1660): pid=7585 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.118469 sshd-session[7585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:14.111000 audit[7585]: CRED_ACQ pid=7585 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.126109 kernel: audit: type=1103 audit(1769216234.111:1661): pid=7585 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.111000 audit[7585]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeec65e910 a2=3 a3=0 items=0 ppid=1 pid=7585 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=110 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:14.134615 kernel: audit: type=1006 audit(1769216234.111:1662): pid=7585 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=110 res=1 Jan 24 00:57:14.134705 kernel: audit: type=1300 audit(1769216234.111:1662): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeec65e910 a2=3 a3=0 items=0 ppid=1 pid=7585 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=110 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:14.111000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:14.144119 kernel: audit: type=1327 audit(1769216234.111:1662): 
proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:14.149362 systemd-logind[1656]: New session 110 of user core. Jan 24 00:57:14.154575 systemd[1]: Started session-110.scope - Session 110 of User core. Jan 24 00:57:14.163000 audit[7585]: USER_START pid=7585 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.172159 kernel: audit: type=1105 audit(1769216234.163:1663): pid=7585 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.172000 audit[7589]: CRED_ACQ pid=7589 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.180126 kernel: audit: type=1103 audit(1769216234.172:1664): pid=7589 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.606823 sshd[7589]: Connection closed by 4.153.228.146 port 47668 Jan 24 00:57:14.607176 sshd-session[7585]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:14.609000 audit[7585]: USER_END pid=7585 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.615256 systemd[1]: sshd@110-65.109.167.77:22-4.153.228.146:47668.service: Deactivated successfully. 
Jan 24 00:57:14.618543 kubelet[2863]: E0124 00:57:14.618117 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:57:14.621079 kernel: audit: type=1106 audit(1769216234.609:1665): pid=7585 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.621134 kernel: audit: type=1104 audit(1769216234.609:1666): pid=7585 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.609000 audit[7585]: CRED_DISP pid=7585 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:14.620571 systemd[1]: session-110.scope: Deactivated successfully. Jan 24 00:57:14.612000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-65.109.167.77:22-4.153.228.146:47668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:14.626393 systemd-logind[1656]: Session 110 logged out. Waiting for processes to exit. Jan 24 00:57:14.628121 systemd-logind[1656]: Removed session 110. 
Jan 24 00:57:19.622664 kubelet[2863]: E0124 00:57:19.622390 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:57:19.624377 kubelet[2863]: E0124 00:57:19.622658 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:57:19.744534 systemd[1]: Started sshd@111-65.109.167.77:22-4.153.228.146:46264.service - OpenSSH per-connection server daemon (4.153.228.146:46264). Jan 24 00:57:19.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-65.109.167.77:22-4.153.228.146:46264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:19.746708 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:19.746800 kernel: audit: type=1130 audit(1769216239.743:1668): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-65.109.167.77:22-4.153.228.146:46264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:57:20.439383 sshd[7601]: Accepted publickey for core from 4.153.228.146 port 46264 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:20.438000 audit[7601]: USER_ACCT pid=7601 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.447137 kernel: audit: type=1101 audit(1769216240.438:1669): pid=7601 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.445000 audit[7601]: CRED_ACQ pid=7601 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.448201 sshd-session[7601]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:20.455092 kernel: audit: type=1103 audit(1769216240.445:1670): pid=7601 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.463473 kernel: audit: type=1006 audit(1769216240.446:1671): pid=7601 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=111 res=1 Jan 24 00:57:20.446000 audit[7601]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcff8689b0 a2=3 a3=0 items=0 ppid=1 pid=7601 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=111 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:20.470617 kernel: audit: type=1300 audit(1769216240.446:1671): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcff8689b0 a2=3 a3=0 items=0 ppid=1 pid=7601 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=111 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:20.446000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:20.474083 kernel: audit: type=1327 audit(1769216240.446:1671): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:20.476207 systemd-logind[1656]: New session 111 of user core. Jan 24 00:57:20.479300 systemd[1]: Started session-111.scope - Session 111 of User core. 
Jan 24 00:57:20.491528 kernel: audit: type=1105 audit(1769216240.483:1672): pid=7601 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.483000 audit[7601]: USER_START pid=7601 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.490000 audit[7619]: CRED_ACQ pid=7619 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.498099 kernel: audit: type=1103 audit(1769216240.490:1673): pid=7619 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.874093 sshd[7619]: Connection closed by 4.153.228.146 port 46264 Jan 24 00:57:20.875903 sshd-session[7601]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:20.876000 audit[7601]: USER_END pid=7601 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.881014 systemd[1]: sshd@111-65.109.167.77:22-4.153.228.146:46264.service: Deactivated successfully. Jan 24 00:57:20.885764 systemd-logind[1656]: Session 111 logged out. Waiting for processes to exit. Jan 24 00:57:20.887550 kernel: audit: type=1106 audit(1769216240.876:1674): pid=7601 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.887612 kernel: audit: type=1104 audit(1769216240.876:1675): pid=7601 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.876000 audit[7601]: CRED_DISP pid=7601 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:20.889610 systemd[1]: session-111.scope: Deactivated successfully. Jan 24 00:57:20.879000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-65.109.167.77:22-4.153.228.146:46264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:57:20.897506 systemd-logind[1656]: Removed session 111. Jan 24 00:57:21.620785 kubelet[2863]: E0124 00:57:21.620709 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:57:22.617678 kubelet[2863]: E0124 00:57:22.617581 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:57:23.618591 kubelet[2863]: E0124 00:57:23.618464 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:57:26.015818 systemd[1]: Started sshd@112-65.109.167.77:22-4.153.228.146:60436.service - OpenSSH per-connection server daemon (4.153.228.146:60436). Jan 24 00:57:26.025043 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:26.025158 kernel: audit: type=1130 audit(1769216246.015:1677): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-65.109.167.77:22-4.153.228.146:60436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:26.015000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-65.109.167.77:22-4.153.228.146:60436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:57:26.711000 audit[7633]: USER_ACCT pid=7633 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:26.713341 sshd[7633]: Accepted publickey for core from 4.153.228.146 port 60436 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:26.716160 sshd-session[7633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:26.714000 audit[7633]: CRED_ACQ pid=7633 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:26.720703 kernel: audit: type=1101 audit(1769216246.711:1678): pid=7633 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:26.720761 kernel: audit: type=1103 audit(1769216246.714:1679): pid=7633 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:26.728867 systemd-logind[1656]: New session 112 of user core. Jan 24 00:57:26.714000 audit[7633]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff1b6f7d0 a2=3 a3=0 items=0 ppid=1 pid=7633 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=112 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:26.736114 kernel: audit: type=1006 audit(1769216246.714:1680): pid=7633 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=112 res=1 Jan 24 00:57:26.736160 kernel: audit: type=1300 audit(1769216246.714:1680): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff1b6f7d0 a2=3 a3=0 items=0 ppid=1 pid=7633 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=112 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:26.740739 systemd[1]: Started session-112.scope - Session 112 of User core. 
Jan 24 00:57:26.714000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:26.746161 kernel: audit: type=1327 audit(1769216246.714:1680): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:26.746000 audit[7633]: USER_START pid=7633 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:26.755124 kernel: audit: type=1105 audit(1769216246.746:1681): pid=7633 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:26.755000 audit[7637]: CRED_ACQ pid=7637 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:26.762095 kernel: audit: type=1103 audit(1769216246.755:1682): pid=7637 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:27.210083 sshd[7637]: Connection closed by 4.153.228.146 port 60436 Jan 24 00:57:27.212370 sshd-session[7633]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:27.214000 audit[7633]: USER_END pid=7633 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:27.228084 kernel: audit: type=1106 audit(1769216247.214:1683): pid=7633 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:27.228160 kernel: audit: type=1104 audit(1769216247.217:1684): pid=7633 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:27.217000 audit[7633]: CRED_DISP pid=7633 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:27.225380 systemd[1]: sshd@112-65.109.167.77:22-4.153.228.146:60436.service: Deactivated successfully. Jan 24 00:57:27.224000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-65.109.167.77:22-4.153.228.146:60436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:57:27.231603 systemd[1]: session-112.scope: Deactivated successfully. Jan 24 00:57:27.236698 systemd-logind[1656]: Session 112 logged out. Waiting for processes to exit. Jan 24 00:57:27.239567 systemd-logind[1656]: Removed session 112. Jan 24 00:57:29.619304 kubelet[2863]: E0124 00:57:29.618355 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:57:32.365140 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:32.365450 kernel: audit: type=1130 audit(1769216252.349:1686): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-65.109.167.77:22-4.153.228.146:60444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:32.349000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-65.109.167.77:22-4.153.228.146:60444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:32.350305 systemd[1]: Started sshd@113-65.109.167.77:22-4.153.228.146:60444.service - OpenSSH per-connection server daemon (4.153.228.146:60444). Jan 24 00:57:33.041000 audit[7658]: USER_ACCT pid=7658 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.048400 sshd[7658]: Accepted publickey for core from 4.153.228.146 port 60444 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:33.049225 kernel: audit: type=1101 audit(1769216253.041:1687): pid=7658 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.050674 sshd-session[7658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:33.048000 audit[7658]: CRED_ACQ pid=7658 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.057117 kernel: audit: type=1103 audit(1769216253.048:1688): pid=7658 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.048000 audit[7658]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff628b7b60 a2=3 a3=0 items=0 ppid=1 pid=7658 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=113 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:33.065198 kernel: 
audit: type=1006 audit(1769216253.048:1689): pid=7658 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=113 res=1 Jan 24 00:57:33.065272 kernel: audit: type=1300 audit(1769216253.048:1689): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff628b7b60 a2=3 a3=0 items=0 ppid=1 pid=7658 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=113 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:33.048000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:33.071401 kernel: audit: type=1327 audit(1769216253.048:1689): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:33.077344 systemd-logind[1656]: New session 113 of user core. Jan 24 00:57:33.084212 systemd[1]: Started session-113.scope - Session 113 of User core. Jan 24 00:57:33.088000 audit[7658]: USER_START pid=7658 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.096000 audit[7688]: CRED_ACQ pid=7688 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.099923 kernel: audit: type=1105 audit(1769216253.088:1690): pid=7658 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.099987 kernel: audit: type=1103 audit(1769216253.096:1691): pid=7688 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.462086 sshd[7688]: Connection closed by 4.153.228.146 port 60444 Jan 24 00:57:33.462616 sshd-session[7658]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:33.462000 audit[7658]: USER_END pid=7658 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.467248 systemd[1]: sshd@113-65.109.167.77:22-4.153.228.146:60444.service: Deactivated successfully. Jan 24 00:57:33.468817 systemd[1]: session-113.scope: Deactivated successfully. Jan 24 00:57:33.469631 systemd-logind[1656]: Session 113 logged out. Waiting for processes to exit. 
Jan 24 00:57:33.472187 kernel: audit: type=1106 audit(1769216253.462:1692): pid=7658 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.473891 systemd-logind[1656]: Removed session 113. Jan 24 00:57:33.462000 audit[7658]: CRED_DISP pid=7658 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:33.465000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-65.109.167.77:22-4.153.228.146:60444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:33.482120 kernel: audit: type=1104 audit(1769216253.462:1693): pid=7658 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:34.618047 kubelet[2863]: E0124 00:57:34.617876 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:57:34.618887 kubelet[2863]: E0124 00:57:34.618570 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:57:34.619542 kubelet[2863]: E0124 00:57:34.619246 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:57:35.617913 kubelet[2863]: E0124 00:57:35.617585 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:57:36.615881 kubelet[2863]: E0124 00:57:36.615710 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:57:38.600865 systemd[1]: Started sshd@114-65.109.167.77:22-4.153.228.146:50174.service - OpenSSH per-connection server daemon (4.153.228.146:50174). Jan 24 00:57:38.600000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-65.109.167.77:22-4.153.228.146:50174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:38.603968 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:38.604043 kernel: audit: type=1130 audit(1769216258.600:1695): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-65.109.167.77:22-4.153.228.146:50174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:57:39.273000 audit[7699]: USER_ACCT pid=7699 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.278138 sshd[7699]: Accepted publickey for core from 4.153.228.146 port 50174 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:39.280521 sshd-session[7699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:39.290155 kernel: audit: type=1101 audit(1769216259.273:1696): pid=7699 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.273000 audit[7699]: CRED_ACQ pid=7699 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.304292 kernel: audit: type=1103 audit(1769216259.273:1697): pid=7699 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.317146 kernel: audit: type=1006 audit(1769216259.273:1698): pid=7699 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=114 res=1 Jan 24 00:57:39.273000 audit[7699]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe35530860 a2=3 a3=0 items=0 ppid=1 pid=7699 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=114 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:39.323445 systemd-logind[1656]: New session 114 of user core. Jan 24 00:57:39.332634 kernel: audit: type=1300 audit(1769216259.273:1698): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe35530860 a2=3 a3=0 items=0 ppid=1 pid=7699 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=114 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:39.332709 kernel: audit: type=1327 audit(1769216259.273:1698): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:39.273000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:39.343311 systemd[1]: Started session-114.scope - Session 114 of User core. 
Jan 24 00:57:39.347000 audit[7699]: USER_START pid=7699 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.364338 kernel: audit: type=1105 audit(1769216259.347:1699): pid=7699 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.362000 audit[7703]: CRED_ACQ pid=7703 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.380126 kernel: audit: type=1103 audit(1769216259.362:1700): pid=7703 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.703078 sshd[7703]: Connection closed by 4.153.228.146 port 50174 Jan 24 00:57:39.703536 sshd-session[7699]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:39.706000 audit[7699]: USER_END pid=7699 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.717109 kernel: audit: type=1106 audit(1769216259.706:1701): pid=7699 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.720279 systemd-logind[1656]: Session 114 logged out. Waiting for processes to exit. Jan 24 00:57:39.720856 systemd[1]: sshd@114-65.109.167.77:22-4.153.228.146:50174.service: Deactivated successfully. Jan 24 00:57:39.706000 audit[7699]: CRED_DISP pid=7699 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.728573 kernel: audit: type=1104 audit(1769216259.706:1702): pid=7699 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:39.729351 systemd[1]: session-114.scope: Deactivated successfully. Jan 24 00:57:39.733468 systemd-logind[1656]: Removed session 114. Jan 24 00:57:39.721000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-65.109.167.77:22-4.153.228.146:50174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:57:41.617239 kubelet[2863]: E0124 00:57:41.616721 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:57:44.847477 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:44.847655 kernel: audit: type=1130 audit(1769216264.839:1704): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-65.109.167.77:22-4.153.228.146:55880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:44.839000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-65.109.167.77:22-4.153.228.146:55880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:44.840156 systemd[1]: Started sshd@115-65.109.167.77:22-4.153.228.146:55880.service - OpenSSH per-connection server daemon (4.153.228.146:55880). Jan 24 00:57:45.546944 sshd[7715]: Accepted publickey for core from 4.153.228.146 port 55880 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:45.545000 audit[7715]: USER_ACCT pid=7715 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:45.552554 sshd-session[7715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:45.548000 audit[7715]: CRED_ACQ pid=7715 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:45.556277 kernel: audit: type=1101 audit(1769216265.545:1705): pid=7715 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:45.556457 kernel: audit: type=1103 audit(1769216265.548:1706): pid=7715 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:45.560204 kernel: audit: type=1006 audit(1769216265.548:1707): pid=7715 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=115 res=1 Jan 24 00:57:45.565300 kernel: audit: type=1300 audit(1769216265.548:1707): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc084f9e50 a2=3 a3=0 items=0 ppid=1 pid=7715 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=115 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:45.548000 audit[7715]: SYSCALL 
arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc084f9e50 a2=3 a3=0 items=0 ppid=1 pid=7715 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=115 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:45.566387 systemd-logind[1656]: New session 115 of user core. Jan 24 00:57:45.548000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:45.574590 kernel: audit: type=1327 audit(1769216265.548:1707): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:45.577392 systemd[1]: Started session-115.scope - Session 115 of User core. Jan 24 00:57:45.596142 kernel: audit: type=1105 audit(1769216265.587:1708): pid=7715 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:45.587000 audit[7715]: USER_START pid=7715 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:45.595000 audit[7719]: CRED_ACQ pid=7719 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:45.609119 kernel: audit: type=1103 audit(1769216265.595:1709): pid=7719 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:45.618411 kubelet[2863]: E0124 00:57:45.618372 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:57:46.023244 sshd[7719]: Connection closed by 4.153.228.146 port 55880 Jan 24 00:57:46.025080 sshd-session[7715]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:46.025000 audit[7715]: USER_END pid=7715 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 
terminal=ssh res=success' Jan 24 00:57:46.025000 audit[7715]: CRED_DISP pid=7715 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:46.029417 systemd[1]: sshd@115-65.109.167.77:22-4.153.228.146:55880.service: Deactivated successfully. Jan 24 00:57:46.030959 systemd[1]: session-115.scope: Deactivated successfully. Jan 24 00:57:46.035667 systemd-logind[1656]: Session 115 logged out. Waiting for processes to exit. Jan 24 00:57:46.037301 kernel: audit: type=1106 audit(1769216266.025:1710): pid=7715 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:46.037359 kernel: audit: type=1104 audit(1769216266.025:1711): pid=7715 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:46.037745 systemd-logind[1656]: Removed session 115. Jan 24 00:57:46.028000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-65.109.167.77:22-4.153.228.146:55880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:46.618123 kubelet[2863]: E0124 00:57:46.616821 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:57:48.617042 kubelet[2863]: E0124 00:57:48.616965 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:57:49.617517 kubelet[2863]: E0124 00:57:49.617126 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:57:51.157000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-65.109.167.77:22-4.153.228.146:55896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:51.158584 systemd[1]: Started sshd@116-65.109.167.77:22-4.153.228.146:55896.service - OpenSSH per-connection server daemon (4.153.228.146:55896). Jan 24 00:57:51.160458 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:51.160487 kernel: audit: type=1130 audit(1769216271.157:1713): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-65.109.167.77:22-4.153.228.146:55896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:51.617291 kubelet[2863]: E0124 00:57:51.617096 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:57:51.834000 audit[7731]: USER_ACCT pid=7731 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:51.851360 kernel: audit: type=1101 audit(1769216271.834:1714): pid=7731 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:51.854216 sshd[7731]: Accepted publickey for core from 4.153.228.146 port 55896 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:51.851000 audit[7731]: CRED_ACQ pid=7731 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:51.856293 sshd-session[7731]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:51.878616 kernel: audit: type=1103 audit(1769216271.851:1715): pid=7731 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:51.878736 kernel: audit: type=1006 audit(1769216271.852:1716): pid=7731 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=116 res=1 Jan 24 00:57:51.876566 systemd-logind[1656]: New session 116 of user core. 
Jan 24 00:57:51.852000 audit[7731]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff94737350 a2=3 a3=0 items=0 ppid=1 pid=7731 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=116 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:51.899200 kernel: audit: type=1300 audit(1769216271.852:1716): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff94737350 a2=3 a3=0 items=0 ppid=1 pid=7731 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=116 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:51.899341 kernel: audit: type=1327 audit(1769216271.852:1716): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:51.852000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:51.900905 systemd[1]: Started session-116.scope - Session 116 of User core. Jan 24 00:57:51.910000 audit[7731]: USER_START pid=7731 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:51.926135 kernel: audit: type=1105 audit(1769216271.910:1717): pid=7731 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:51.914000 audit[7735]: CRED_ACQ pid=7735 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:51.938264 kernel: audit: type=1103 audit(1769216271.914:1718): pid=7735 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:52.334199 sshd[7735]: Connection closed by 4.153.228.146 port 55896 Jan 24 00:57:52.336739 sshd-session[7731]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:52.337000 audit[7731]: USER_END pid=7731 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:52.349083 kernel: audit: type=1106 audit(1769216272.337:1719): pid=7731 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:52.348499 systemd[1]: sshd@116-65.109.167.77:22-4.153.228.146:55896.service: Deactivated successfully. 
Jan 24 00:57:52.337000 audit[7731]: CRED_DISP pid=7731 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:52.352627 systemd[1]: session-116.scope: Deactivated successfully. Jan 24 00:57:52.353580 systemd-logind[1656]: Session 116 logged out. Waiting for processes to exit. Jan 24 00:57:52.356697 systemd-logind[1656]: Removed session 116. Jan 24 00:57:52.358365 kernel: audit: type=1104 audit(1769216272.337:1720): pid=7731 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:52.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-65.109.167.77:22-4.153.228.146:55896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:53.618771 kubelet[2863]: E0124 00:57:53.618691 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:57:57.467173 systemd[1]: Started sshd@117-65.109.167.77:22-4.153.228.146:58280.service - OpenSSH per-connection server daemon (4.153.228.146:58280). Jan 24 00:57:57.475542 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:57:57.475611 kernel: audit: type=1130 audit(1769216277.466:1722): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-65.109.167.77:22-4.153.228.146:58280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:57.466000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-65.109.167.77:22-4.153.228.146:58280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:57:57.617838 kubelet[2863]: E0124 00:57:57.617500 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:57:58.130000 audit[7746]: USER_ACCT pid=7746 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.133264 sshd-session[7746]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:57:58.134661 sshd[7746]: Accepted publickey for core from 4.153.228.146 port 58280 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:57:58.146127 kernel: audit: type=1101 audit(1769216278.130:1723): pid=7746 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.146189 kernel: audit: type=1103 audit(1769216278.131:1724): pid=7746 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.131000 audit[7746]: CRED_ACQ pid=7746 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.146469 systemd-logind[1656]: New session 117 of user core. Jan 24 00:57:58.160412 systemd[1]: Started session-117.scope - Session 117 of User core. 
Jan 24 00:57:58.161932 kernel: audit: type=1006 audit(1769216278.131:1725): pid=7746 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=117 res=1 Jan 24 00:57:58.169416 kernel: audit: type=1300 audit(1769216278.131:1725): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff8e3da360 a2=3 a3=0 items=0 ppid=1 pid=7746 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=117 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:58.131000 audit[7746]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff8e3da360 a2=3 a3=0 items=0 ppid=1 pid=7746 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=117 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:57:58.131000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:58.183879 kernel: audit: type=1327 audit(1769216278.131:1725): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:57:58.187661 kernel: audit: type=1105 audit(1769216278.182:1726): pid=7746 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.182000 audit[7746]: USER_START pid=7746 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.200000 audit[7750]: CRED_ACQ pid=7750 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.208101 kernel: audit: type=1103 audit(1769216278.200:1727): pid=7750 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.593093 sshd[7750]: Connection closed by 4.153.228.146 port 58280 Jan 24 00:57:58.593763 sshd-session[7746]: pam_unix(sshd:session): session closed for user core Jan 24 00:57:58.599000 audit[7746]: USER_END pid=7746 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.619113 kernel: audit: type=1106 audit(1769216278.599:1728): pid=7746 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.622903 systemd[1]: 
sshd@117-65.109.167.77:22-4.153.228.146:58280.service: Deactivated successfully. Jan 24 00:57:58.599000 audit[7746]: CRED_DISP pid=7746 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:57:58.626666 kubelet[2863]: E0124 00:57:58.623968 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:57:58.627829 systemd[1]: session-117.scope: Deactivated successfully. Jan 24 00:57:58.631737 systemd-logind[1656]: Session 117 logged out. Waiting for processes to exit. Jan 24 00:57:58.632664 systemd-logind[1656]: Removed session 117. Jan 24 00:57:58.624000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-65.109.167.77:22-4.153.228.146:58280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:57:58.637414 kernel: audit: type=1104 audit(1769216278.599:1729): pid=7746 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:02.616827 kubelet[2863]: E0124 00:58:02.616708 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:58:03.620437 kubelet[2863]: E0124 00:58:03.620159 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:58:03.728917 systemd[1]: Started sshd@118-65.109.167.77:22-4.153.228.146:58290.service - OpenSSH per-connection server daemon (4.153.228.146:58290). 
Jan 24 00:58:03.734843 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:58:03.734881 kernel: audit: type=1130 audit(1769216283.728:1731): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-65.109.167.77:22-4.153.228.146:58290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:58:03.728000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-65.109.167.77:22-4.153.228.146:58290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:58:04.419000 audit[7789]: USER_ACCT pid=7789 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.424552 sshd-session[7789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:58:04.428207 sshd[7789]: Accepted publickey for core from 4.153.228.146 port 58290 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:58:04.421000 audit[7789]: CRED_ACQ pid=7789 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.437269 kernel: audit: type=1101 audit(1769216284.419:1732): pid=7789 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.437351 kernel: audit: type=1103 audit(1769216284.421:1733): pid=7789 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.446316 systemd-logind[1656]: New session 118 of user core. Jan 24 00:58:04.450738 kernel: audit: type=1006 audit(1769216284.421:1734): pid=7789 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=118 res=1 Jan 24 00:58:04.421000 audit[7789]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc05fd8430 a2=3 a3=0 items=0 ppid=1 pid=7789 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=118 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:04.459096 kernel: audit: type=1300 audit(1769216284.421:1734): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc05fd8430 a2=3 a3=0 items=0 ppid=1 pid=7789 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=118 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:04.460775 systemd[1]: Started session-118.scope - Session 118 of User core. 
Jan 24 00:58:04.421000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:58:04.475287 kernel: audit: type=1327 audit(1769216284.421:1734): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:58:04.475366 kernel: audit: type=1105 audit(1769216284.472:1735): pid=7789 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.472000 audit[7789]: USER_START pid=7789 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.488000 audit[7793]: CRED_ACQ pid=7793 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.500179 kernel: audit: type=1103 audit(1769216284.488:1736): pid=7793 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.616923 kubelet[2863]: E0124 00:58:04.616630 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:58:04.617747 kubelet[2863]: E0124 00:58:04.617687 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:58:04.891126 sshd[7793]: Connection closed by 4.153.228.146 port 58290 Jan 24 00:58:04.892368 sshd-session[7789]: pam_unix(sshd:session): session closed for user core Jan 24 00:58:04.899000 audit[7789]: USER_END pid=7789 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.900000 audit[7789]: CRED_DISP pid=7789 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.920585 systemd[1]: sshd@118-65.109.167.77:22-4.153.228.146:58290.service: Deactivated successfully. Jan 24 00:58:04.921859 kernel: audit: type=1106 audit(1769216284.899:1737): pid=7789 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.921888 kernel: audit: type=1104 audit(1769216284.900:1738): pid=7789 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:04.929476 systemd[1]: session-118.scope: Deactivated successfully. Jan 24 00:58:04.921000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-65.109.167.77:22-4.153.228.146:58290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:58:04.935268 systemd-logind[1656]: Session 118 logged out. Waiting for processes to exit. Jan 24 00:58:04.937207 systemd-logind[1656]: Removed session 118. Jan 24 00:58:09.621467 kubelet[2863]: E0124 00:58:09.621394 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:58:09.622418 kubelet[2863]: E0124 00:58:09.621873 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:58:10.041180 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:58:10.041312 kernel: audit: type=1130 audit(1769216290.026:1740): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-65.109.167.77:22-4.153.228.146:48530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:58:10.026000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-65.109.167.77:22-4.153.228.146:48530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 24 00:58:10.027284 systemd[1]: Started sshd@119-65.109.167.77:22-4.153.228.146:48530.service - OpenSSH per-connection server daemon (4.153.228.146:48530). Jan 24 00:58:10.699000 audit[7805]: USER_ACCT pid=7805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:10.705435 sshd-session[7805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:58:10.719884 kernel: audit: type=1101 audit(1769216290.699:1741): pid=7805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:10.719949 sshd[7805]: Accepted publickey for core from 4.153.228.146 port 48530 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:58:10.700000 audit[7805]: CRED_ACQ pid=7805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:10.738113 kernel: audit: type=1103 audit(1769216290.700:1742): pid=7805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:10.734865 systemd-logind[1656]: New session 119 of user core. Jan 24 00:58:10.742289 systemd[1]: Started session-119.scope - Session 119 of User core. 
Jan 24 00:58:10.754783 kernel: audit: type=1006 audit(1769216290.700:1743): pid=7805 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=119 res=1 Jan 24 00:58:10.770230 kernel: audit: type=1300 audit(1769216290.700:1743): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd89fa26a0 a2=3 a3=0 items=0 ppid=1 pid=7805 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=119 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:10.700000 audit[7805]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd89fa26a0 a2=3 a3=0 items=0 ppid=1 pid=7805 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=119 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:10.700000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:58:10.776222 kernel: audit: type=1327 audit(1769216290.700:1743): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:58:10.758000 audit[7805]: USER_START pid=7805 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:10.787079 kernel: audit: type=1105 audit(1769216290.758:1744): pid=7805 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:10.769000 audit[7809]: CRED_ACQ pid=7809 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:10.795095 kernel: audit: type=1103 audit(1769216290.769:1745): pid=7809 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:11.160195 sshd[7809]: Connection closed by 4.153.228.146 port 48530 Jan 24 00:58:11.162349 sshd-session[7805]: pam_unix(sshd:session): session closed for user core Jan 24 00:58:11.163000 audit[7805]: USER_END pid=7805 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:11.170764 systemd-logind[1656]: Session 119 logged out. Waiting for processes to exit. Jan 24 00:58:11.173715 systemd[1]: sshd@119-65.109.167.77:22-4.153.228.146:48530.service: Deactivated successfully. Jan 24 00:58:11.179931 systemd[1]: session-119.scope: Deactivated successfully. 
Jan 24 00:58:11.181123 kernel: audit: type=1106 audit(1769216291.163:1746): pid=7805 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:11.163000 audit[7805]: CRED_DISP pid=7805 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:11.186756 systemd-logind[1656]: Removed session 119. Jan 24 00:58:11.194175 kernel: audit: type=1104 audit(1769216291.163:1747): pid=7805 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:11.173000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-65.109.167.77:22-4.153.228.146:48530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:58:13.617159 kubelet[2863]: E0124 00:58:13.616541 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:58:14.618906 kubelet[2863]: E0124 00:58:14.618837 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:58:16.293290 systemd[1]: Started sshd@120-65.109.167.77:22-4.153.228.146:43468.service - OpenSSH per-connection server daemon (4.153.228.146:43468). Jan 24 00:58:16.292000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-65.109.167.77:22-4.153.228.146:43468 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:58:16.294555 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:58:16.294589 kernel: audit: type=1130 audit(1769216296.292:1749): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-65.109.167.77:22-4.153.228.146:43468 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 24 00:58:16.956000 audit[7820]: USER_ACCT pid=7820 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:16.962220 sshd-session[7820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 24 00:58:16.964850 sshd[7820]: Accepted publickey for core from 4.153.228.146 port 43468 ssh2: RSA SHA256:BIHzTJCHw/I+EBJbR3itWbFHPm6CQGUUdrKFO8BxY6c Jan 24 00:58:16.973278 kernel: audit: type=1101 audit(1769216296.956:1750): pid=7820 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:16.986144 kernel: audit: type=1103 audit(1769216296.956:1751): pid=7820 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:16.956000 audit[7820]: CRED_ACQ pid=7820 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:16.982232 systemd-logind[1656]: New session 120 of user core. Jan 24 00:58:16.956000 audit[7820]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe17e59c0 a2=3 a3=0 items=0 ppid=1 pid=7820 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=120 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:16.998782 kernel: audit: type=1006 audit(1769216296.956:1752): pid=7820 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=120 res=1 Jan 24 00:58:16.998858 kernel: audit: type=1300 audit(1769216296.956:1752): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe17e59c0 a2=3 a3=0 items=0 ppid=1 pid=7820 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=120 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:16.956000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:58:17.011118 kernel: audit: type=1327 audit(1769216296.956:1752): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 24 00:58:17.012344 systemd[1]: Started session-120.scope - Session 120 of User core. 
Jan 24 00:58:17.018000 audit[7820]: USER_START pid=7820 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:17.022000 audit[7824]: CRED_ACQ pid=7824 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:17.035006 kernel: audit: type=1105 audit(1769216297.018:1753): pid=7820 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:17.035223 kernel: audit: type=1103 audit(1769216297.022:1754): pid=7824 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:17.465096 sshd[7824]: Connection closed by 4.153.228.146 port 43468 Jan 24 00:58:17.467794 sshd-session[7820]: pam_unix(sshd:session): session closed for user core Jan 24 00:58:17.472000 audit[7820]: USER_END pid=7820 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:17.492203 kernel: audit: type=1106 audit(1769216297.472:1755): pid=7820 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:17.472000 audit[7820]: CRED_DISP pid=7820 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:17.496442 systemd[1]: sshd@120-65.109.167.77:22-4.153.228.146:43468.service: Deactivated successfully. Jan 24 00:58:17.502131 systemd[1]: session-120.scope: Deactivated successfully. Jan 24 00:58:17.503533 kernel: audit: type=1104 audit(1769216297.472:1756): pid=7820 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 24 00:58:17.504941 systemd-logind[1656]: Session 120 logged out. Waiting for processes to exit. Jan 24 00:58:17.494000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-65.109.167.77:22-4.153.228.146:43468 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 24 00:58:17.511662 systemd-logind[1656]: Removed session 120. Jan 24 00:58:17.619298 kubelet[2863]: E0124 00:58:17.618821 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:58:17.622693 kubelet[2863]: E0124 00:58:17.622257 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:58:20.616047 kubelet[2863]: E0124 00:58:20.615972 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:58:21.618697 kubelet[2863]: E0124 00:58:21.618609 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:58:28.628903 kubelet[2863]: E0124 00:58:28.628521 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:58:28.630991 kubelet[2863]: E0124 00:58:28.630057 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:58:28.630991 kubelet[2863]: E0124 00:58:28.630725 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:58:29.617266 kubelet[2863]: E0124 00:58:29.617040 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:58:34.617053 kubelet[2863]: E0124 00:58:34.616944 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:58:34.946161 systemd[1]: cri-containerd-fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce.scope: Deactivated successfully. Jan 24 00:58:34.948221 systemd[1]: cri-containerd-fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce.scope: Consumed 1min 58.143s CPU time, 120.7M memory peak. 
Jan 24 00:58:34.949000 audit: BPF prog-id=146 op=UNLOAD Jan 24 00:58:34.952373 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 24 00:58:34.952535 kernel: audit: type=1334 audit(1769216314.949:1758): prog-id=146 op=UNLOAD Jan 24 00:58:34.958291 containerd[1682]: time="2026-01-24T00:58:34.958216006Z" level=info msg="received container exit event container_id:\"fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce\" id:\"fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce\" pid:3188 exit_status:1 exited_at:{seconds:1769216314 nanos:952345409}" Jan 24 00:58:34.949000 audit: BPF prog-id=150 op=UNLOAD Jan 24 00:58:34.964122 kernel: audit: type=1334 audit(1769216314.949:1759): prog-id=150 op=UNLOAD Jan 24 00:58:35.014737 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce-rootfs.mount: Deactivated successfully. Jan 24 00:58:35.394114 kubelet[2863]: E0124 00:58:35.393371 2863 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:35006->10.0.0.2:2379: read: connection timed out" Jan 24 00:58:35.472155 systemd[1]: cri-containerd-bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad.scope: Deactivated successfully. Jan 24 00:58:35.472793 systemd[1]: cri-containerd-bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad.scope: Consumed 8.287s CPU time, 66.1M memory peak, 192K read from disk. Jan 24 00:58:35.474000 audit: BPF prog-id=256 op=LOAD Jan 24 00:58:35.480557 containerd[1682]: time="2026-01-24T00:58:35.477253555Z" level=info msg="received container exit event container_id:\"bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad\" id:\"bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad\" pid:2709 exit_status:1 exited_at:{seconds:1769216315 nanos:476668456}" Jan 24 00:58:35.482022 kernel: audit: type=1334 audit(1769216315.474:1760): prog-id=256 op=LOAD Jan 24 00:58:35.488217 kernel: audit: type=1334 audit(1769216315.479:1761): prog-id=108 op=UNLOAD Jan 24 00:58:35.479000 audit: BPF prog-id=108 op=UNLOAD Jan 24 00:58:35.479000 audit: BPF prog-id=112 op=UNLOAD Jan 24 00:58:35.480000 audit: BPF prog-id=88 op=UNLOAD Jan 24 00:58:35.497057 kernel: audit: type=1334 audit(1769216315.479:1762): prog-id=112 op=UNLOAD Jan 24 00:58:35.497143 kernel: audit: type=1334 audit(1769216315.480:1763): prog-id=88 op=UNLOAD Jan 24 00:58:35.531782 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad-rootfs.mount: Deactivated successfully. 
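The exit event containerd emits for fb20456a… reports exited_at as a raw {seconds, nanos} pair, and the kernel audit records on the same lines stamp themselves with the same Unix epoch (audit(1769216314.949:…)). Converting the pair back to the RFC 3339 form used elsewhere in these messages is a one-liner:

```go
// Convert the exited_at {seconds:1769216314 nanos:952345409} pair from the
// containerd exit event above into the RFC 3339 timestamps the surrounding
// log lines use; the kernel audit tag audit(1769216314.949:...) counts from
// the same epoch.
package main

import (
	"fmt"
	"time"
)

func main() {
	t := time.Unix(1769216314, 952345409).UTC()
	fmt.Println(t.Format(time.RFC3339Nano)) // 2026-01-24T00:58:34.952345409Z
}
```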
Jan 24 00:58:35.617937 kubelet[2863]: E0124 00:58:35.617809 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:58:36.011102 kubelet[2863]: I0124 00:58:36.011023 2863 scope.go:117] "RemoveContainer" containerID="fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce" Jan 24 00:58:36.016218 kubelet[2863]: I0124 00:58:36.016028 2863 scope.go:117] "RemoveContainer" containerID="bd71b9b62407b31439937f713f0c85f23cbb38b07cfc69f4ee61abd6c124d8ad" Jan 24 00:58:36.023958 containerd[1682]: time="2026-01-24T00:58:36.023862219Z" level=info msg="CreateContainer within sandbox \"1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 24 00:58:36.028505 containerd[1682]: time="2026-01-24T00:58:36.028430187Z" level=info msg="CreateContainer within sandbox \"c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 24 00:58:36.044172 containerd[1682]: time="2026-01-24T00:58:36.043306000Z" level=info msg="Container 3dc97fad75dbcf9230e627dd7935d4cb7b35975c5f6acb5fc8a08087dc48229c: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:58:36.045872 containerd[1682]: time="2026-01-24T00:58:36.045827079Z" level=info msg="Container a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:58:36.056027 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4008018478.mount: Deactivated successfully. 
Jan 24 00:58:36.061799 containerd[1682]: time="2026-01-24T00:58:36.061741131Z" level=info msg="CreateContainer within sandbox \"1e4853a363e27e94a2347401150b690736121ddb4a32eb68c5a0f10638226f15\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"3dc97fad75dbcf9230e627dd7935d4cb7b35975c5f6acb5fc8a08087dc48229c\"" Jan 24 00:58:36.062843 containerd[1682]: time="2026-01-24T00:58:36.062803030Z" level=info msg="StartContainer for \"3dc97fad75dbcf9230e627dd7935d4cb7b35975c5f6acb5fc8a08087dc48229c\"" Jan 24 00:58:36.064587 containerd[1682]: time="2026-01-24T00:58:36.064516189Z" level=info msg="CreateContainer within sandbox \"c95907dae8b1dd0b335eb404719b5ebdd429e35f13988537f8334ea5c995b8da\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2\"" Jan 24 00:58:36.064851 containerd[1682]: time="2026-01-24T00:58:36.064811159Z" level=info msg="connecting to shim 3dc97fad75dbcf9230e627dd7935d4cb7b35975c5f6acb5fc8a08087dc48229c" address="unix:///run/containerd/s/0249266bdae7f393ec8080b59834f0b7f8943ea10ac9194187eac7ebc74dfdb6" protocol=ttrpc version=3 Jan 24 00:58:36.065698 containerd[1682]: time="2026-01-24T00:58:36.065640548Z" level=info msg="StartContainer for \"a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2\"" Jan 24 00:58:36.067319 containerd[1682]: time="2026-01-24T00:58:36.067170787Z" level=info msg="connecting to shim a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2" address="unix:///run/containerd/s/4e60b15fb44aa56a176acf912469d108694efd9035acae36a1b93ca7161871ed" protocol=ttrpc version=3 Jan 24 00:58:36.115319 systemd[1]: Started cri-containerd-a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2.scope - libcontainer container a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2. Jan 24 00:58:36.120937 systemd[1]: Started cri-containerd-3dc97fad75dbcf9230e627dd7935d4cb7b35975c5f6acb5fc8a08087dc48229c.scope - libcontainer container 3dc97fad75dbcf9230e627dd7935d4cb7b35975c5f6acb5fc8a08087dc48229c. 
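Both replacement containers are wired up the same way: containerd logs "connecting to shim &lt;container id&gt;" together with the per-shim unix socket it will speak ttrpc v3 over. When triaging a log like this it can help to pull those two fields out mechanically; a small, purely illustrative helper (the sample line is copied from the entries above):

```go
// Extract the container ID and shim socket from a containerd
// "connecting to shim" line like the ones logged above. Illustrative only;
// the sample text is taken from this log.
package main

import (
	"fmt"
	"regexp"
)

var shimLine = regexp.MustCompile(`connecting to shim ([0-9a-f]{64})" address="(unix://[^"]+)"`)

func main() {
	sample := `level=info msg="connecting to shim 3dc97fad75dbcf9230e627dd7935d4cb7b35975c5f6acb5fc8a08087dc48229c" address="unix:///run/containerd/s/0249266bdae7f393ec8080b59834f0b7f8943ea10ac9194187eac7ebc74dfdb6" protocol=ttrpc version=3`
	if m := shimLine.FindStringSubmatch(sample); m != nil {
		fmt.Println("container:  ", m[1])
		fmt.Println("shim socket:", m[2])
	}
}
```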
Jan 24 00:58:36.145000 audit: BPF prog-id=257 op=LOAD Jan 24 00:58:36.151100 kernel: audit: type=1334 audit(1769216316.145:1764): prog-id=257 op=LOAD Jan 24 00:58:36.147000 audit: BPF prog-id=258 op=LOAD Jan 24 00:58:36.157121 kernel: audit: type=1334 audit(1769216316.147:1765): prog-id=258 op=LOAD Jan 24 00:58:36.147000 audit[7891]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3000 pid=7891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263 Jan 24 00:58:36.169423 kernel: audit: type=1300 audit(1769216316.147:1765): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3000 pid=7891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.169514 kernel: audit: type=1327 audit(1769216316.147:1765): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263 Jan 24 00:58:36.147000 audit: BPF prog-id=258 op=UNLOAD Jan 24 00:58:36.147000 audit[7891]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=7891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263 Jan 24 00:58:36.147000 audit: BPF prog-id=259 op=LOAD Jan 24 00:58:36.147000 audit[7891]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3000 pid=7891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263 Jan 24 00:58:36.147000 audit: BPF prog-id=260 op=LOAD Jan 24 00:58:36.147000 audit[7891]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3000 pid=7891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.147000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263 Jan 24 00:58:36.147000 audit: BPF prog-id=260 op=UNLOAD Jan 24 00:58:36.147000 audit[7891]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=7891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263 Jan 24 00:58:36.147000 audit: BPF prog-id=259 op=UNLOAD Jan 24 00:58:36.147000 audit[7891]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=7891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263 Jan 24 00:58:36.147000 audit: BPF prog-id=261 op=LOAD Jan 24 00:58:36.147000 audit[7891]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3000 pid=7891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263 Jan 24 00:58:36.155000 audit: BPF prog-id=262 op=LOAD Jan 24 00:58:36.156000 audit: BPF prog-id=263 op=LOAD Jan 24 00:58:36.156000 audit[7892]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2562 pid=7892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633937666164373564626366393233306536323764643739333564 Jan 24 00:58:36.156000 audit: BPF prog-id=263 op=UNLOAD Jan 24 00:58:36.156000 audit[7892]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=7892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.156000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633937666164373564626366393233306536323764643739333564 Jan 24 00:58:36.156000 audit: BPF prog-id=264 op=LOAD Jan 24 00:58:36.156000 audit[7892]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2562 pid=7892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633937666164373564626366393233306536323764643739333564 Jan 24 00:58:36.156000 audit: BPF prog-id=265 op=LOAD Jan 24 00:58:36.156000 audit[7892]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2562 pid=7892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633937666164373564626366393233306536323764643739333564 Jan 24 00:58:36.156000 audit: BPF prog-id=265 op=UNLOAD Jan 24 00:58:36.156000 audit[7892]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=7892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633937666164373564626366393233306536323764643739333564 Jan 24 00:58:36.156000 audit: BPF prog-id=264 op=UNLOAD Jan 24 00:58:36.156000 audit[7892]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=7892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633937666164373564626366393233306536323764643739333564 Jan 24 00:58:36.156000 audit: BPF prog-id=266 op=LOAD Jan 24 00:58:36.156000 audit[7892]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2562 pid=7892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:36.156000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633937666164373564626366393233306536323764643739333564 Jan 24 00:58:36.196259 containerd[1682]: time="2026-01-24T00:58:36.196213513Z" level=info msg="StartContainer for \"a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2\" returns successfully" Jan 24 00:58:36.228316 containerd[1682]: time="2026-01-24T00:58:36.228256957Z" level=info msg="StartContainer for \"3dc97fad75dbcf9230e627dd7935d4cb7b35975c5f6acb5fc8a08087dc48229c\" returns successfully" Jan 24 00:58:38.333624 kubelet[2863]: E0124 00:58:38.333390 2863 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:34796->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4593-0-0-9-1308b066bf.188d84dbe5931dc0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4593-0-0-9-1308b066bf,UID:fb10a41fc95ee03e4fcf793bdb671159,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4593-0-0-9-1308b066bf,},FirstTimestamp:2026-01-24 00:58:28.610801088 +0000 UTC m=+785.142787945,LastTimestamp:2026-01-24 00:58:28.610801088 +0000 UTC m=+785.142787945,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-9-1308b066bf,}" Jan 24 00:58:40.465804 systemd[1]: cri-containerd-5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff.scope: Deactivated successfully. Jan 24 00:58:40.467170 systemd[1]: cri-containerd-5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff.scope: Consumed 6.143s CPU time, 24.1M memory peak, 172K read from disk. Jan 24 00:58:40.470785 containerd[1682]: time="2026-01-24T00:58:40.470401135Z" level=info msg="received container exit event container_id:\"5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff\" id:\"5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff\" pid:2671 exit_status:1 exited_at:{seconds:1769216320 nanos:469737934}" Jan 24 00:58:40.479831 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 24 00:58:40.479910 kernel: audit: type=1334 audit(1769216320.468:1780): prog-id=267 op=LOAD Jan 24 00:58:40.468000 audit: BPF prog-id=267 op=LOAD Jan 24 00:58:40.485154 kernel: audit: type=1334 audit(1769216320.468:1781): prog-id=83 op=UNLOAD Jan 24 00:58:40.468000 audit: BPF prog-id=83 op=UNLOAD Jan 24 00:58:40.469000 audit: BPF prog-id=98 op=UNLOAD Jan 24 00:58:40.469000 audit: BPF prog-id=102 op=UNLOAD Jan 24 00:58:40.493053 kernel: audit: type=1334 audit(1769216320.469:1782): prog-id=98 op=UNLOAD Jan 24 00:58:40.493182 kernel: audit: type=1334 audit(1769216320.469:1783): prog-id=102 op=UNLOAD Jan 24 00:58:40.527379 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff-rootfs.mount: Deactivated successfully. 
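The audit SYSCALL/PROCTITLE pairs that bracket each container start carry the full runc command line, hex-encoded with NUL bytes between the arguments and truncated (here to 128 bytes). Decoding the proctitle recorded for the a7eb67c1… start shows there is nothing exotic in it; a minimal decoder:

```go
// Decode an audit PROCTITLE value: the kernel hex-encodes the command line
// and separates arguments with NUL bytes, truncating the whole value (here
// at 128 bytes). The constant below is the proctitle logged for the
// a7eb67c1... container start above.
package main

import (
	"encoding/hex"
	"fmt"
	"log"
	"strings"
)

func main() {
	const proctitle = "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137656236376331653838323835663039353665306436633137383263"
	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(strings.ReplaceAll(string(raw), "\x00", " "))
	// Output: runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/a7eb67c1e88285f0956e0d6c1782c
}
```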
Jan 24 00:58:40.616429 kubelet[2863]: E0124 00:58:40.616367 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5" Jan 24 00:58:41.045098 kubelet[2863]: I0124 00:58:41.045020 2863 scope.go:117] "RemoveContainer" containerID="5e403805f2b37e6beed50b73ff4fa198fe796cdc3b3127bd77c823c5105a34ff" Jan 24 00:58:41.047929 containerd[1682]: time="2026-01-24T00:58:41.047881074Z" level=info msg="CreateContainer within sandbox \"ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 24 00:58:41.061588 containerd[1682]: time="2026-01-24T00:58:41.060536617Z" level=info msg="Container d54ae481c138391c00c6359fb0ec78a6205d3d43266fa5c5a41f4f4804773a4b: CDI devices from CRI Config.CDIDevices: []" Jan 24 00:58:41.073637 containerd[1682]: time="2026-01-24T00:58:41.073560021Z" level=info msg="CreateContainer within sandbox \"ee137d8e0ed0b974e1f268f0cc9485fff902e487d3e0090d2f94923889985059\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"d54ae481c138391c00c6359fb0ec78a6205d3d43266fa5c5a41f4f4804773a4b\"" Jan 24 00:58:41.074971 containerd[1682]: time="2026-01-24T00:58:41.074709300Z" level=info msg="StartContainer for \"d54ae481c138391c00c6359fb0ec78a6205d3d43266fa5c5a41f4f4804773a4b\"" Jan 24 00:58:41.076723 containerd[1682]: time="2026-01-24T00:58:41.076687990Z" level=info msg="connecting to shim d54ae481c138391c00c6359fb0ec78a6205d3d43266fa5c5a41f4f4804773a4b" address="unix:///run/containerd/s/ebf551b31e5c9b711e8b691f6e6f8b0fc219e0a6d3842e59f3ef4958802d1dfe" protocol=ttrpc version=3 Jan 24 00:58:41.119401 systemd[1]: Started cri-containerd-d54ae481c138391c00c6359fb0ec78a6205d3d43266fa5c5a41f4f4804773a4b.scope - libcontainer container d54ae481c138391c00c6359fb0ec78a6205d3d43266fa5c5a41f4f4804773a4b. 
Jan 24 00:58:41.142000 audit: BPF prog-id=268 op=LOAD Jan 24 00:58:41.149237 kernel: audit: type=1334 audit(1769216321.142:1784): prog-id=268 op=LOAD Jan 24 00:58:41.148000 audit: BPF prog-id=269 op=LOAD Jan 24 00:58:41.148000 audit[7964]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.153817 kernel: audit: type=1334 audit(1769216321.148:1785): prog-id=269 op=LOAD Jan 24 00:58:41.153917 kernel: audit: type=1300 audit(1769216321.148:1785): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.148000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435346165343831633133383339316330306336333539666230656337 Jan 24 00:58:41.161626 kernel: audit: type=1327 audit(1769216321.148:1785): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435346165343831633133383339316330306336333539666230656337 Jan 24 00:58:41.148000 audit: BPF prog-id=269 op=UNLOAD Jan 24 00:58:41.168147 kernel: audit: type=1334 audit(1769216321.148:1786): prog-id=269 op=UNLOAD Jan 24 00:58:41.148000 audit[7964]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.172345 kernel: audit: type=1300 audit(1769216321.148:1786): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.148000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435346165343831633133383339316330306336333539666230656337 Jan 24 00:58:41.148000 audit: BPF prog-id=270 op=LOAD Jan 24 00:58:41.148000 audit[7964]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.148000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435346165343831633133383339316330306336333539666230656337 Jan 24 00:58:41.148000 audit: BPF prog-id=271 op=LOAD Jan 24 00:58:41.148000 audit[7964]: SYSCALL arch=c000003e syscall=321 
success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.148000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435346165343831633133383339316330306336333539666230656337 Jan 24 00:58:41.149000 audit: BPF prog-id=271 op=UNLOAD Jan 24 00:58:41.149000 audit[7964]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.149000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435346165343831633133383339316330306336333539666230656337 Jan 24 00:58:41.149000 audit: BPF prog-id=270 op=UNLOAD Jan 24 00:58:41.149000 audit[7964]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.149000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435346165343831633133383339316330306336333539666230656337 Jan 24 00:58:41.149000 audit: BPF prog-id=272 op=LOAD Jan 24 00:58:41.149000 audit[7964]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=2533 pid=7964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 24 00:58:41.149000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435346165343831633133383339316330306336333539666230656337 Jan 24 00:58:41.224537 containerd[1682]: time="2026-01-24T00:58:41.224407485Z" level=info msg="StartContainer for \"d54ae481c138391c00c6359fb0ec78a6205d3d43266fa5c5a41f4f4804773a4b\" returns successfully" Jan 24 00:58:41.619658 kubelet[2863]: E0124 00:58:41.619606 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-xp2rb" podUID="d83d59e3-6296-40d9-bb63-5a69b654ac0c" Jan 24 00:58:42.616790 kubelet[2863]: E0124 00:58:42.616727 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:58:44.617310 kubelet[2863]: E0124 00:58:44.617175 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7779cd58b4-jsxfd" podUID="7fb26181-9fdc-4f96-be2c-85fbaa5f21b7" Jan 24 00:58:45.395189 kubelet[2863]: E0124 00:58:45.394710 2863 controller.go:195] "Failed to update lease" err="Put \"https://65.109.167.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-9-1308b066bf?timeout=10s\": context deadline exceeded" Jan 24 00:58:46.461616 kubelet[2863]: I0124 00:58:46.461495 2863 status_manager.go:895] "Failed to get status for pod" podUID="1deb031e-6e9b-46ac-8d9a-64b60fe3b32b" pod="tigera-operator/tigera-operator-7dcd859c48-w5567" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:34936->10.0.0.2:2379: read: connection timed out" Jan 24 00:58:46.616502 kubelet[2863]: E0124 00:58:46.616381 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qhldj" podUID="d0cc0ca8-4b85-478b-b9c2-c61b42d93c89" Jan 24 00:58:47.401843 systemd[1]: cri-containerd-a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2.scope: Deactivated successfully. Jan 24 00:58:47.404439 containerd[1682]: time="2026-01-24T00:58:47.404393187Z" level=info msg="received container exit event container_id:\"a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2\" id:\"a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2\" pid:7915 exit_status:1 exited_at:{seconds:1769216327 nanos:403658028}" Jan 24 00:58:47.412285 kernel: kauditd_printk_skb: 16 callbacks suppressed Jan 24 00:58:47.412412 kernel: audit: type=1334 audit(1769216327.404:1792): prog-id=257 op=UNLOAD Jan 24 00:58:47.404000 audit: BPF prog-id=257 op=UNLOAD Jan 24 00:58:47.414799 kernel: audit: type=1334 audit(1769216327.404:1793): prog-id=261 op=UNLOAD Jan 24 00:58:47.404000 audit: BPF prog-id=261 op=UNLOAD Jan 24 00:58:47.480716 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2-rootfs.mount: Deactivated successfully. 
Jan 24 00:58:48.070918 kubelet[2863]: I0124 00:58:48.070838 2863 scope.go:117] "RemoveContainer" containerID="fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce" Jan 24 00:58:48.071708 kubelet[2863]: I0124 00:58:48.071387 2863 scope.go:117] "RemoveContainer" containerID="a7eb67c1e88285f0956e0d6c1782c24cdea9c45452426f0ff13a8a05d37b4cc2" Jan 24 00:58:48.071708 kubelet[2863]: E0124 00:58:48.071641 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-w5567_tigera-operator(1deb031e-6e9b-46ac-8d9a-64b60fe3b32b)\"" pod="tigera-operator/tigera-operator-7dcd859c48-w5567" podUID="1deb031e-6e9b-46ac-8d9a-64b60fe3b32b" Jan 24 00:58:48.074025 containerd[1682]: time="2026-01-24T00:58:48.073946071Z" level=info msg="RemoveContainer for \"fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce\"" Jan 24 00:58:48.093951 containerd[1682]: time="2026-01-24T00:58:48.093869371Z" level=info msg="RemoveContainer for \"fb20456a76a221b5ce242f0d74a9ba6ac0ba3bc345c3042096c7036a3f9c13ce\" returns successfully" Jan 24 00:58:50.616897 kubelet[2863]: E0124 00:58:50.616819 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-njf24" podUID="12f6abfd-9f5a-45d2-b23b-65ea3c59cfbc" Jan 24 00:58:54.617025 kubelet[2863]: E0124 00:58:54.616959 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-58fdcd774c-w2drb" podUID="f546e732-cf0b-44c7-9678-ae1cb31a23a4" Jan 24 00:58:55.396617 kubelet[2863]: E0124 00:58:55.395579 2863 controller.go:195] "Failed to update lease" err="Put \"https://65.109.167.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-9-1308b066bf?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 24 00:58:55.617574 kubelet[2863]: E0124 00:58:55.617398 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not 
found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58c54c478-wd6fd" podUID="dbbc15c6-c8eb-4fd5-85b0-95f3e37249b5"