Jan 14 01:30:33.143633 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Tue Jan 13 22:26:24 -00 2026
Jan 14 01:30:33.143664 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=ef461ed71f713584f576c99df12ffb04dd99b33cd2d16edeb307d0cf2f5b4260
Jan 14 01:30:33.143674 kernel: BIOS-provided physical RAM map:
Jan 14 01:30:33.143681 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 14 01:30:33.143687 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Jan 14 01:30:33.143693 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jan 14 01:30:33.143702 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Jan 14 01:30:33.143709 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jan 14 01:30:33.143715 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Jan 14 01:30:33.143722 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jan 14 01:30:33.143728 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable
Jan 14 01:30:33.143734 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Jan 14 01:30:33.143740 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Jan 14 01:30:33.143747 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Jan 14 01:30:33.143756 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Jan 14 01:30:33.143763 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 14 01:30:33.143770 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 14 01:30:33.143776 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 14 01:30:33.143783 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable
Jan 14 01:30:33.143789 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Jan 14 01:30:33.143797 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Jan 14 01:30:33.143804 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable
Jan 14 01:30:33.143810 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Jan 14 01:30:33.143817 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Jan 14 01:30:33.143823 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 14 01:30:33.143830 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 14 01:30:33.143836 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 14 01:30:33.143843 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable
Jan 14 01:30:33.143849 kernel: NX (Execute Disable) protection: active
Jan 14 01:30:33.143856 kernel: APIC: Static calls initialized
Jan 14 01:30:33.143862 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable
Jan 14 01:30:33.143871 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable
Jan 14 01:30:33.143877 kernel: extended physical RAM map:
Jan 14 01:30:33.143884 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 14 01:30:33.143891 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Jan 14 01:30:33.143897 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jan 14 01:30:33.143904 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Jan 14 01:30:33.143910 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jan 14 01:30:33.143928 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Jan 14 01:30:33.143935 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jan 14 01:30:33.143946 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable
Jan 14 01:30:33.143953 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable
Jan 14 01:30:33.143960 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable
Jan 14 01:30:33.143967 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable
Jan 14 01:30:33.143975 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable
Jan 14 01:30:33.143995 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Jan 14 01:30:33.144002 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Jan 14 01:30:33.144009 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Jan 14 01:30:33.144016 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Jan 14 01:30:33.144023 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 14 01:30:33.144030 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 14 01:30:33.144037 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 14 01:30:33.144044 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable
Jan 14 01:30:33.144051 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Jan 14 01:30:33.144058 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Jan 14 01:30:33.144067 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable
Jan 14 01:30:33.144074 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Jan 14 01:30:33.144081 kernel: reserve setup_data: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Jan 14 01:30:33.144088 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 14 01:30:33.144095 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 14 01:30:33.144102 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 14 01:30:33.144109 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable
Jan 14 01:30:33.144116 kernel: efi: EFI v2.7 by EDK II
Jan 14 01:30:33.144130 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018
Jan 14 01:30:33.144137 kernel: random: crng init done
Jan 14 01:30:33.144144 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Jan 14 01:30:33.144153 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Jan 14 01:30:33.144160 kernel: secureboot: Secure boot disabled
Jan 14 01:30:33.144167 kernel: SMBIOS 2.8 present.
Jan 14 01:30:33.144174 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Jan 14 01:30:33.144181 kernel: DMI: Memory slots populated: 1/1
Jan 14 01:30:33.144188 kernel: Hypervisor detected: KVM
Jan 14 01:30:33.144195 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Jan 14 01:30:33.144202 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 14 01:30:33.144208 kernel: kvm-clock: using sched offset of 5719803720 cycles
Jan 14 01:30:33.144216 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 14 01:30:33.144226 kernel: tsc: Detected 2294.608 MHz processor
Jan 14 01:30:33.144234 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 14 01:30:33.144242 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 14 01:30:33.144249 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000
Jan 14 01:30:33.144257 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Jan 14 01:30:33.144264 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 14 01:30:33.144271 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Jan 14 01:30:33.144279 kernel: Using GB pages for direct mapping
Jan 14 01:30:33.144288 kernel: ACPI: Early table checksum verification disabled
Jan 14 01:30:33.144295 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Jan 14 01:30:33.144303 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013)
Jan 14 01:30:33.144310 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:30:33.144317 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:30:33.144325 kernel: ACPI: FACS 0x000000007FBDD000 000040
Jan 14 01:30:33.144332 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:30:33.144341 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:30:33.144348 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:30:33.144355 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 14 01:30:33.144363 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3]
Jan 14 01:30:33.144370 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b]
Jan 14 01:30:33.144377 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Jan 14 01:30:33.144384 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f]
Jan 14 01:30:33.144394 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b]
Jan 14 01:30:33.144401 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027]
Jan 14 01:30:33.144408 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037]
Jan 14 01:30:33.144416 kernel: No NUMA configuration found
Jan 14 01:30:33.144423 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff]
Jan 14 01:30:33.144430 kernel: NODE_DATA(0) allocated [mem 0x17fff8dc0-0x17fffffff]
Jan 14 01:30:33.144438 kernel: Zone ranges:
Jan 14 01:30:33.144445 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 14 01:30:33.144455 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Jan 14 01:30:33.144462 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff]
Jan 14 01:30:33.144470 kernel: Device empty
Jan 14 01:30:33.144477 kernel: Movable zone start for each node
Jan 14 01:30:33.144484 kernel: Early memory node ranges
Jan 14 01:30:33.144491 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Jan 14 01:30:33.144499 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Jan 14 01:30:33.144509 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Jan 14 01:30:33.144516 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Jan 14 01:30:33.144523 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff]
Jan 14 01:30:33.144530 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff]
Jan 14 01:30:33.144538 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff]
Jan 14 01:30:33.144551 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff]
Jan 14 01:30:33.144561 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff]
Jan 14 01:30:33.144569 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff]
Jan 14 01:30:33.144577 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff]
Jan 14 01:30:33.144585 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 14 01:30:33.144595 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Jan 14 01:30:33.144602 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Jan 14 01:30:33.144611 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 14 01:30:33.144618 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Jan 14 01:30:33.144628 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Jan 14 01:30:33.144636 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges
Jan 14 01:30:33.144644 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Jan 14 01:30:33.144652 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Jan 14 01:30:33.144660 kernel: On node 0, zone Normal: 276 pages in unavailable ranges
Jan 14 01:30:33.144668 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 14 01:30:33.144676 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 14 01:30:33.144684 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 14 01:30:33.144694 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 14 01:30:33.144702 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 14 01:30:33.144710 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 14 01:30:33.144718 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 14 01:30:33.144726 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 14 01:30:33.144734 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 14 01:30:33.144742 kernel: TSC deadline timer available
Jan 14 01:30:33.144752 kernel: CPU topo: Max. logical packages: 2
Jan 14 01:30:33.144760 kernel: CPU topo: Max. logical dies: 2
Jan 14 01:30:33.144768 kernel: CPU topo: Max. dies per package: 1
Jan 14 01:30:33.144776 kernel: CPU topo: Max. threads per core: 1
Jan 14 01:30:33.144783 kernel: CPU topo: Num. cores per package: 1
Jan 14 01:30:33.144791 kernel: CPU topo: Num. threads per package: 1
Jan 14 01:30:33.144800 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jan 14 01:30:33.144809 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 14 01:30:33.144817 kernel: kvm-guest: KVM setup pv remote TLB flush
Jan 14 01:30:33.144825 kernel: kvm-guest: setup PV sched yield
Jan 14 01:30:33.144833 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Jan 14 01:30:33.144841 kernel: Booting paravirtualized kernel on KVM
Jan 14 01:30:33.144849 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 14 01:30:33.144858 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jan 14 01:30:33.144866 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jan 14 01:30:33.144876 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jan 14 01:30:33.144884 kernel: pcpu-alloc: [0] 0 1
Jan 14 01:30:33.144892 kernel: kvm-guest: PV spinlocks enabled
Jan 14 01:30:33.144900 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 14 01:30:33.144910 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=ef461ed71f713584f576c99df12ffb04dd99b33cd2d16edeb307d0cf2f5b4260
Jan 14 01:30:33.144918 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 14 01:30:33.144928 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 14 01:30:33.144936 kernel: Fallback order for Node 0: 0
Jan 14 01:30:33.144944 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1046694
Jan 14 01:30:33.144952 kernel: Policy zone: Normal
Jan 14 01:30:33.144960 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 14 01:30:33.144968 kernel: software IO TLB: area num 2.
Jan 14 01:30:33.144976 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 14 01:30:33.144998 kernel: ftrace: allocating 40128 entries in 157 pages
Jan 14 01:30:33.145006 kernel: ftrace: allocated 157 pages with 5 groups
Jan 14 01:30:33.145014 kernel: Dynamic Preempt: voluntary
Jan 14 01:30:33.145022 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 14 01:30:33.145031 kernel: rcu: RCU event tracing is enabled.
Jan 14 01:30:33.145039 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 14 01:30:33.145048 kernel: Trampoline variant of Tasks RCU enabled.
Jan 14 01:30:33.145056 kernel: Rude variant of Tasks RCU enabled.
Jan 14 01:30:33.145065 kernel: Tracing variant of Tasks RCU enabled.
Jan 14 01:30:33.145073 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 14 01:30:33.145081 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 14 01:30:33.145089 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 01:30:33.145097 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 01:30:33.145106 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 01:30:33.145114 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jan 14 01:30:33.145123 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 14 01:30:33.145131 kernel: Console: colour dummy device 80x25
Jan 14 01:30:33.145139 kernel: printk: legacy console [tty0] enabled
Jan 14 01:30:33.145147 kernel: printk: legacy console [ttyS0] enabled
Jan 14 01:30:33.145155 kernel: ACPI: Core revision 20240827
Jan 14 01:30:33.145163 kernel: APIC: Switch to symmetric I/O mode setup
Jan 14 01:30:33.145171 kernel: x2apic enabled
Jan 14 01:30:33.145181 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 14 01:30:33.145189 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Jan 14 01:30:33.145197 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Jan 14 01:30:33.145205 kernel: kvm-guest: setup PV IPIs
Jan 14 01:30:33.145213 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns
Jan 14 01:30:33.145221 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608)
Jan 14 01:30:33.145229 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 14 01:30:33.145239 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 14 01:30:33.145246 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 14 01:30:33.145254 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 14 01:30:33.145261 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Jan 14 01:30:33.145269 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 14 01:30:33.145277 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 14 01:30:33.145284 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 14 01:30:33.145292 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 14 01:30:33.145299 kernel: TAA: Mitigation: Clear CPU buffers
Jan 14 01:30:33.145307 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Jan 14 01:30:33.145316 kernel: active return thunk: its_return_thunk
Jan 14 01:30:33.145323 kernel: ITS: Mitigation: Aligned branch/return thunks
Jan 14 01:30:33.145331 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 14 01:30:33.145338 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 14 01:30:33.145346 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 14 01:30:33.145353 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Jan 14 01:30:33.145361 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Jan 14 01:30:33.145368 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Jan 14 01:30:33.145376 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Jan 14 01:30:33.145383 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 14 01:30:33.145392 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Jan 14 01:30:33.145400 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Jan 14 01:30:33.145408 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Jan 14 01:30:33.145415 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Jan 14 01:30:33.145423 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Jan 14 01:30:33.145430 kernel: Freeing SMP alternatives memory: 32K
Jan 14 01:30:33.145438 kernel: pid_max: default: 32768 minimum: 301
Jan 14 01:30:33.145445 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 14 01:30:33.145453 kernel: landlock: Up and running.
Jan 14 01:30:33.145460 kernel: SELinux: Initializing.
Jan 14 01:30:33.145468 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 14 01:30:33.145477 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 14 01:30:33.145485 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6)
Jan 14 01:30:33.145492 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver.
Jan 14 01:30:33.145501 kernel: ... version: 2
Jan 14 01:30:33.145509 kernel: ... bit width: 48
Jan 14 01:30:33.145517 kernel: ... generic registers: 8
Jan 14 01:30:33.145525 kernel: ... value mask: 0000ffffffffffff
Jan 14 01:30:33.145533 kernel: ... max period: 00007fffffffffff
Jan 14 01:30:33.145543 kernel: ... fixed-purpose events: 3
Jan 14 01:30:33.145551 kernel: ... event mask: 00000007000000ff
Jan 14 01:30:33.145558 kernel: signal: max sigframe size: 3632
Jan 14 01:30:33.145566 kernel: rcu: Hierarchical SRCU implementation.
Jan 14 01:30:33.145574 kernel: rcu: Max phase no-delay instances is 400.
Jan 14 01:30:33.145582 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 14 01:30:33.145591 kernel: smp: Bringing up secondary CPUs ...
Jan 14 01:30:33.145601 kernel: smpboot: x86: Booting SMP configuration:
Jan 14 01:30:33.145609 kernel: .... node #0, CPUs: #1
Jan 14 01:30:33.145617 kernel: smp: Brought up 1 node, 2 CPUs
Jan 14 01:30:33.145625 kernel: smpboot: Total of 2 processors activated (9178.43 BogoMIPS)
Jan 14 01:30:33.145634 kernel: Memory: 3969768K/4186776K available (14336K kernel code, 2445K rwdata, 31644K rodata, 15536K init, 2500K bss, 212128K reserved, 0K cma-reserved)
Jan 14 01:30:33.145642 kernel: devtmpfs: initialized
Jan 14 01:30:33.145650 kernel: x86/mm: Memory block size: 128MB
Jan 14 01:30:33.145661 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Jan 14 01:30:33.145669 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Jan 14 01:30:33.145677 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Jan 14 01:30:33.145685 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Jan 14 01:30:33.145693 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes)
Jan 14 01:30:33.145701 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes)
Jan 14 01:30:33.145709 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 14 01:30:33.145720 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 14 01:30:33.145728 kernel: pinctrl core: initialized pinctrl subsystem
Jan 14 01:30:33.145736 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 14 01:30:33.145744 kernel: audit: initializing netlink subsys (disabled)
Jan 14 01:30:33.145752 kernel: audit: type=2000 audit(1768354230.166:1): state=initialized audit_enabled=0 res=1
Jan 14 01:30:33.145760 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 14 01:30:33.145767 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 14 01:30:33.145777 kernel: cpuidle: using governor menu
Jan 14 01:30:33.145785 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 14 01:30:33.145793 kernel: dca service started, version 1.12.1
Jan 14 01:30:33.145801 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Jan 14 01:30:33.145809 kernel: PCI: Using configuration type 1 for base access
Jan 14 01:30:33.145817 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 14 01:30:33.145825 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 14 01:30:33.145835 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 14 01:30:33.145843 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 14 01:30:33.145851 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 14 01:30:33.145859 kernel: ACPI: Added _OSI(Module Device)
Jan 14 01:30:33.145867 kernel: ACPI: Added _OSI(Processor Device)
Jan 14 01:30:33.145875 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 14 01:30:33.145883 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 14 01:30:33.145891 kernel: ACPI: Interpreter enabled
Jan 14 01:30:33.145901 kernel: ACPI: PM: (supports S0 S3 S5)
Jan 14 01:30:33.145908 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 14 01:30:33.145916 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 14 01:30:33.145924 kernel: PCI: Using E820 reservations for host bridge windows
Jan 14 01:30:33.145932 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 14 01:30:33.145940 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 14 01:30:33.146139 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 14 01:30:33.146247 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jan 14 01:30:33.146346 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jan 14 01:30:33.146356 kernel: PCI host bridge to bus 0000:00
Jan 14 01:30:33.146465 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 14 01:30:33.146556 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 14 01:30:33.146646 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 14 01:30:33.146733 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Jan 14 01:30:33.146821 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Jan 14 01:30:33.146907 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window]
Jan 14 01:30:33.147007 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 14 01:30:33.147125 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jan 14 01:30:33.147241 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 14 01:30:33.147339 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Jan 14 01:30:33.147441 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref]
Jan 14 01:30:33.147536 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff]
Jan 14 01:30:33.147632 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Jan 14 01:30:33.147734 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 14 01:30:33.147848 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.147959 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff]
Jan 14 01:30:33.148070 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 14 01:30:33.148169 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff]
Jan 14 01:30:33.148269 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff]
Jan 14 01:30:33.148373 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Jan 14 01:30:33.148480 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.148581 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff]
Jan 14 01:30:33.148680 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 14 01:30:33.148778 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff]
Jan 14 01:30:33.148880 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Jan 14 01:30:33.148998 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.149100 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff]
Jan 14 01:30:33.149198 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 14 01:30:33.149295 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff]
Jan 14 01:30:33.149398 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Jan 14 01:30:33.149508 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.149609 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff]
Jan 14 01:30:33.149711 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 14 01:30:33.149816 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff]
Jan 14 01:30:33.149918 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref]
Jan 14 01:30:33.150037 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.150141 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff]
Jan 14 01:30:33.150242 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 14 01:30:33.150340 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff]
Jan 14 01:30:33.150441 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref]
Jan 14 01:30:33.150554 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.150659 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff]
Jan 14 01:30:33.150763 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 14 01:30:33.150861 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff]
Jan 14 01:30:33.150961 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref]
Jan 14 01:30:33.151087 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.151186 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff]
Jan 14 01:30:33.151292 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 14 01:30:33.151390 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff]
Jan 14 01:30:33.151489 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref]
Jan 14 01:30:33.151593 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.151691 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff]
Jan 14 01:30:33.151790 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 14 01:30:33.151897 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff]
Jan 14 01:30:33.152022 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref]
Jan 14 01:30:33.152134 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.152237 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff]
Jan 14 01:30:33.152336 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 14 01:30:33.152434 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff]
Jan 14 01:30:33.152540 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref]
Jan 14 01:30:33.152654 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.152774 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff]
Jan 14 01:30:33.152878 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 14 01:30:33.152979 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff]
Jan 14 01:30:33.153104 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref]
Jan 14 01:30:33.153236 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.153337 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff]
Jan 14 01:30:33.153443 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 14 01:30:33.153541 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff]
Jan 14 01:30:33.153645 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref]
Jan 14 01:30:33.153750 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.153856 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff]
Jan 14 01:30:33.153961 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 14 01:30:33.154075 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff]
Jan 14 01:30:33.154183 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref]
Jan 14 01:30:33.154779 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.154906 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff]
Jan 14 01:30:33.155020 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 14 01:30:33.155118 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff]
Jan 14 01:30:33.155214 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref]
Jan 14 01:30:33.155328 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.155425 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff]
Jan 14 01:30:33.155521 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 14 01:30:33.155617 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff]
Jan 14 01:30:33.155712 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref]
Jan 14 01:30:33.155815 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.155922 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff]
Jan 14 01:30:33.156032 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 14 01:30:33.156129 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff]
Jan 14 01:30:33.156225 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref]
Jan 14 01:30:33.156329 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.156427 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff]
Jan 14 01:30:33.156526 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 14 01:30:33.156642 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff]
Jan 14 01:30:33.157076 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref]
Jan 14 01:30:33.157201 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.157300 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff]
Jan 14 01:30:33.157397 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 14 01:30:33.157498 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff]
Jan 14 01:30:33.157596 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref]
Jan 14 01:30:33.157700 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.157796 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff]
Jan 14 01:30:33.157891 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Jan 14 01:30:33.158021 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff]
Jan 14 01:30:33.158124 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref]
Jan 14 01:30:33.158231 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:30:33.158327 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff]
Jan 14 01:30:33.158423 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Jan 14 01:30:33.158520 kernel: pci 0000:00:04.2: bridge window [mem 
0x81c00000-0x81dfffff] Jan 14 01:30:33.158617 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 14 01:30:33.158724 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.158820 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Jan 14 01:30:33.158916 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 14 01:30:33.160160 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 14 01:30:33.160269 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 14 01:30:33.160383 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.160490 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Jan 14 01:30:33.160587 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 14 01:30:33.160681 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 14 01:30:33.160778 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 14 01:30:33.160882 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.160979 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Jan 14 01:30:33.161095 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 14 01:30:33.161192 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 14 01:30:33.161290 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 14 01:30:33.161397 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.161498 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Jan 14 01:30:33.161594 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 14 01:30:33.161689 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 14 01:30:33.161785 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 14 01:30:33.161889 kernel: pci 
0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.162065 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff] Jan 14 01:30:33.162168 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 14 01:30:33.162265 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 14 01:30:33.162361 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 14 01:30:33.162465 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.162562 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Jan 14 01:30:33.162661 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 14 01:30:33.162756 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 14 01:30:33.162852 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 14 01:30:33.162954 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.163102 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Jan 14 01:30:33.163199 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 14 01:30:33.163299 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 14 01:30:33.163395 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 14 01:30:33.163502 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.163599 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Jan 14 01:30:33.163695 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 14 01:30:33.163791 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 14 01:30:33.163889 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 14 01:30:33.164054 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.164152 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Jan 14 01:30:33.164248 kernel: 
pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 14 01:30:33.164343 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 14 01:30:33.164439 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 14 01:30:33.164545 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 01:30:33.164642 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Jan 14 01:30:33.164740 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 14 01:30:33.164836 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 14 01:30:33.164931 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 14 01:30:33.165048 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 14 01:30:33.165149 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 14 01:30:33.165255 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jan 14 01:30:33.165352 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Jan 14 01:30:33.165448 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Jan 14 01:30:33.165554 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 14 01:30:33.165652 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Jan 14 01:30:33.166305 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Jan 14 01:30:33.166419 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Jan 14 01:30:33.166520 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 14 01:30:33.166619 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 14 01:30:33.166716 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 14 01:30:33.166818 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 01:30:33.166919 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 14 
01:30:33.167054 kernel: pci_bus 0000:02: extended config space not accessible Jan 14 01:30:33.167067 kernel: acpiphp: Slot [1] registered Jan 14 01:30:33.167076 kernel: acpiphp: Slot [0] registered Jan 14 01:30:33.167084 kernel: acpiphp: Slot [2] registered Jan 14 01:30:33.167095 kernel: acpiphp: Slot [3] registered Jan 14 01:30:33.167104 kernel: acpiphp: Slot [4] registered Jan 14 01:30:33.167112 kernel: acpiphp: Slot [5] registered Jan 14 01:30:33.167121 kernel: acpiphp: Slot [6] registered Jan 14 01:30:33.167129 kernel: acpiphp: Slot [7] registered Jan 14 01:30:33.167137 kernel: acpiphp: Slot [8] registered Jan 14 01:30:33.167145 kernel: acpiphp: Slot [9] registered Jan 14 01:30:33.167156 kernel: acpiphp: Slot [10] registered Jan 14 01:30:33.167164 kernel: acpiphp: Slot [11] registered Jan 14 01:30:33.167173 kernel: acpiphp: Slot [12] registered Jan 14 01:30:33.167181 kernel: acpiphp: Slot [13] registered Jan 14 01:30:33.167189 kernel: acpiphp: Slot [14] registered Jan 14 01:30:33.167198 kernel: acpiphp: Slot [15] registered Jan 14 01:30:33.167206 kernel: acpiphp: Slot [16] registered Jan 14 01:30:33.167214 kernel: acpiphp: Slot [17] registered Jan 14 01:30:33.167224 kernel: acpiphp: Slot [18] registered Jan 14 01:30:33.167233 kernel: acpiphp: Slot [19] registered Jan 14 01:30:33.167241 kernel: acpiphp: Slot [20] registered Jan 14 01:30:33.167250 kernel: acpiphp: Slot [21] registered Jan 14 01:30:33.167258 kernel: acpiphp: Slot [22] registered Jan 14 01:30:33.167266 kernel: acpiphp: Slot [23] registered Jan 14 01:30:33.167275 kernel: acpiphp: Slot [24] registered Jan 14 01:30:33.167287 kernel: acpiphp: Slot [25] registered Jan 14 01:30:33.167295 kernel: acpiphp: Slot [26] registered Jan 14 01:30:33.167304 kernel: acpiphp: Slot [27] registered Jan 14 01:30:33.167312 kernel: acpiphp: Slot [28] registered Jan 14 01:30:33.167320 kernel: acpiphp: Slot [29] registered Jan 14 01:30:33.167328 kernel: acpiphp: Slot [30] registered Jan 14 01:30:33.167337 kernel: acpiphp: 
Slot [31] registered Jan 14 01:30:33.167452 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Jan 14 01:30:33.167558 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Jan 14 01:30:33.167657 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 14 01:30:33.167668 kernel: acpiphp: Slot [0-2] registered Jan 14 01:30:33.167772 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 14 01:30:33.167872 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Jan 14 01:30:33.168004 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Jan 14 01:30:33.168107 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 14 01:30:33.168207 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 14 01:30:33.168218 kernel: acpiphp: Slot [0-3] registered Jan 14 01:30:33.168324 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 14 01:30:33.168425 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Jan 14 01:30:33.168527 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Jan 14 01:30:33.168626 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 14 01:30:33.168637 kernel: acpiphp: Slot [0-4] registered Jan 14 01:30:33.168742 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 14 01:30:33.168842 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Jan 14 01:30:33.168939 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 14 01:30:33.168952 kernel: acpiphp: Slot [0-5] registered Jan 14 01:30:33.169070 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 14 01:30:33.169169 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Jan 14 01:30:33.169268 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Jan 14 01:30:33.169366 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 14 01:30:33.169377 kernel: acpiphp: Slot [0-6] 
registered Jan 14 01:30:33.169478 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 14 01:30:33.169489 kernel: acpiphp: Slot [0-7] registered Jan 14 01:30:33.169586 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 14 01:30:33.169597 kernel: acpiphp: Slot [0-8] registered Jan 14 01:30:33.169694 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 14 01:30:33.169705 kernel: acpiphp: Slot [0-9] registered Jan 14 01:30:33.169801 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 14 01:30:33.169815 kernel: acpiphp: Slot [0-10] registered Jan 14 01:30:33.169915 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 14 01:30:33.169926 kernel: acpiphp: Slot [0-11] registered Jan 14 01:30:33.170035 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 14 01:30:33.170046 kernel: acpiphp: Slot [0-12] registered Jan 14 01:30:33.170144 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 14 01:30:33.170157 kernel: acpiphp: Slot [0-13] registered Jan 14 01:30:33.170254 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 14 01:30:33.170265 kernel: acpiphp: Slot [0-14] registered Jan 14 01:30:33.170362 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 14 01:30:33.170373 kernel: acpiphp: Slot [0-15] registered Jan 14 01:30:33.170470 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 14 01:30:33.170483 kernel: acpiphp: Slot [0-16] registered Jan 14 01:30:33.170580 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 14 01:30:33.170591 kernel: acpiphp: Slot [0-17] registered Jan 14 01:30:33.170689 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 14 01:30:33.170701 kernel: acpiphp: Slot [0-18] registered Jan 14 01:30:33.170796 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 14 01:30:33.170807 kernel: acpiphp: Slot [0-19] registered Jan 14 01:30:33.170905 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 14 01:30:33.170917 kernel: acpiphp: Slot [0-20] registered Jan 14 01:30:33.171808 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 14 01:30:33.171826 
kernel: acpiphp: Slot [0-21] registered Jan 14 01:30:33.171941 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 14 01:30:33.171953 kernel: acpiphp: Slot [0-22] registered Jan 14 01:30:33.172404 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 14 01:30:33.172417 kernel: acpiphp: Slot [0-23] registered Jan 14 01:30:33.172515 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 14 01:30:33.172526 kernel: acpiphp: Slot [0-24] registered Jan 14 01:30:33.172624 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 14 01:30:33.172635 kernel: acpiphp: Slot [0-25] registered Jan 14 01:30:33.172732 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 14 01:30:33.172747 kernel: acpiphp: Slot [0-26] registered Jan 14 01:30:33.172846 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 14 01:30:33.172857 kernel: acpiphp: Slot [0-27] registered Jan 14 01:30:33.172954 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 14 01:30:33.172965 kernel: acpiphp: Slot [0-28] registered Jan 14 01:30:33.173072 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 14 01:30:33.173085 kernel: acpiphp: Slot [0-29] registered Jan 14 01:30:33.173183 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 14 01:30:33.173194 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 14 01:30:33.173203 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 14 01:30:33.173211 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 14 01:30:33.173220 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 14 01:30:33.173229 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jan 14 01:30:33.173240 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 14 01:30:33.173249 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 14 01:30:33.173257 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 14 01:30:33.173266 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 14 01:30:33.173274 kernel: ACPI: PCI: Interrupt 
link GSIB configured for IRQ 17 Jan 14 01:30:33.173283 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 14 01:30:33.173291 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 14 01:30:33.173301 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 14 01:30:33.173310 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 14 01:30:33.173318 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 14 01:30:33.173327 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 14 01:30:33.173335 kernel: iommu: Default domain type: Translated Jan 14 01:30:33.173344 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 14 01:30:33.173352 kernel: efivars: Registered efivars operations Jan 14 01:30:33.173363 kernel: PCI: Using ACPI for IRQ routing Jan 14 01:30:33.173371 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 14 01:30:33.173380 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Jan 14 01:30:33.173388 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Jan 14 01:30:33.173397 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Jan 14 01:30:33.173405 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Jan 14 01:30:33.173413 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Jan 14 01:30:33.173423 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Jan 14 01:30:33.173431 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 14 01:30:33.173439 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] Jan 14 01:30:33.173448 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Jan 14 01:30:33.175798 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 14 01:30:33.175904 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 14 01:30:33.176029 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 14 01:30:33.176045 kernel: vgaarb: loaded Jan 14 01:30:33.176054 
kernel: clocksource: Switched to clocksource kvm-clock Jan 14 01:30:33.176063 kernel: VFS: Disk quotas dquot_6.6.0 Jan 14 01:30:33.176073 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 14 01:30:33.176081 kernel: pnp: PnP ACPI init Jan 14 01:30:33.176197 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 14 01:30:33.176212 kernel: pnp: PnP ACPI: found 5 devices Jan 14 01:30:33.176221 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 14 01:30:33.176229 kernel: NET: Registered PF_INET protocol family Jan 14 01:30:33.176238 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 14 01:30:33.176247 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 14 01:30:33.176256 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 14 01:30:33.176264 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 14 01:30:33.176275 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 14 01:30:33.176283 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 14 01:30:33.176292 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 14 01:30:33.176301 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 14 01:30:33.176309 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 14 01:30:33.176318 kernel: NET: Registered PF_XDP protocol family Jan 14 01:30:33.176423 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 14 01:30:33.176526 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 14 01:30:33.176626 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 14 01:30:33.176726 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] 
add_size 1000 Jan 14 01:30:33.176825 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 14 01:30:33.176924 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 14 01:30:33.177035 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 14 01:30:33.177135 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 14 01:30:33.177238 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 14 01:30:33.177338 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 14 01:30:33.177438 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 14 01:30:33.177537 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 14 01:30:33.177636 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 14 01:30:33.177735 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 14 01:30:33.177844 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 14 01:30:33.177948 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 14 01:30:33.179735 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 14 01:30:33.179841 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 14 01:30:33.179960 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 14 01:30:33.180069 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 14 01:30:33.180174 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 14 01:30:33.180276 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 14 01:30:33.180376 kernel: pci 
0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 14 01:30:33.180477 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 14 01:30:33.180578 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 14 01:30:33.180678 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 14 01:30:33.180780 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 14 01:30:33.180882 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 14 01:30:33.180995 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 14 01:30:33.181104 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Jan 14 01:30:33.182034 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Jan 14 01:30:33.182164 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Jan 14 01:30:33.182273 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Jan 14 01:30:33.182378 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Jan 14 01:30:33.182478 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Jan 14 01:30:33.182576 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Jan 14 01:30:33.182673 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Jan 14 01:30:33.182771 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Jan 14 01:30:33.182870 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Jan 14 01:30:33.182971 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Jan 14 01:30:33.183078 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Jan 14 01:30:33.183177 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Jan 14 01:30:33.183275 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: 
can't assign; no space Jan 14 01:30:33.183370 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.183467 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.183563 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.183662 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.183757 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.183854 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.183959 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.184065 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.184160 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.184259 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.184354 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.184452 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.184547 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.184644 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.184740 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.184838 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.184936 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.185048 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.185145 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.185243 kernel: pci 0000:00:05.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.185340 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.185438 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.185536 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.185634 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.185729 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.185829 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.185924 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.186030 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.186128 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.186223 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Jan 14 01:30:33.186318 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Jan 14 01:30:33.187317 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Jan 14 01:30:33.187431 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Jan 14 01:30:33.187528 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Jan 14 01:30:33.187626 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Jan 14 01:30:33.187727 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Jan 14 01:30:33.187825 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Jan 14 01:30:33.187933 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Jan 14 01:30:33.188041 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Jan 14 01:30:33.188138 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Jan 14 01:30:33.188234 kernel: 
pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Jan 14 01:30:33.188333 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Jan 14 01:30:33.188429 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.188525 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.188623 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.188718 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.188815 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.188911 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.189018 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.189115 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.189213 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.189308 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.189404 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.189500 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.189600 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.189695 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.189792 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.189887 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.189983 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.190089 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.190186 
kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.190285 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.190381 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.190476 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.190572 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.190668 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.190766 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.190865 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.190961 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.191064 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.191161 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:30:33.191257 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:30:33.191361 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 14 01:30:33.191459 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 14 01:30:33.191559 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 14 01:30:33.191656 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 01:30:33.191758 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 14 01:30:33.191855 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 14 01:30:33.191957 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 14 01:30:33.195196 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 01:30:33.195307 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned 
Jan 14 01:30:33.195410 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 14 01:30:33.195506 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 14 01:30:33.195601 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 14 01:30:33.195697 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 14 01:30:33.195793 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 14 01:30:33.195888 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 14 01:30:33.196023 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 14 01:30:33.196120 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 14 01:30:33.196217 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 14 01:30:33.196313 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 14 01:30:33.196409 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 14 01:30:33.196504 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 14 01:30:33.196600 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 14 01:30:33.196696 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 14 01:30:33.196791 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 14 01:30:33.196891 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 14 01:30:33.196996 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 14 01:30:33.197093 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 14 01:30:33.197189 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 14 01:30:33.197283 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 14 01:30:33.197380 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 14 01:30:33.197475 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 14 
01:30:33.197570 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 14 01:30:33.197665 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 14 01:30:33.197761 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 14 01:30:33.197856 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 14 01:30:33.197951 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 14 01:30:33.198062 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 14 01:30:33.198162 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 14 01:30:33.198257 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 14 01:30:33.198352 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 14 01:30:33.198449 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 14 01:30:33.198544 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 14 01:30:33.198639 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 14 01:30:33.198735 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 14 01:30:33.198832 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 14 01:30:33.198927 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 14 01:30:33.199031 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 14 01:30:33.199127 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 14 01:30:33.199224 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 14 01:30:33.199319 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 14 01:30:33.199415 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 14 01:30:33.199514 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 14 01:30:33.201132 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 
14 01:30:33.201246 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 14 01:30:33.201348 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 14 01:30:33.201446 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 14 01:30:33.201543 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 14 01:30:33.201640 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 14 01:30:33.201741 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 14 01:30:33.201837 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 14 01:30:33.201933 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 14 01:30:33.202048 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 14 01:30:33.202147 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 14 01:30:33.202242 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 14 01:30:33.202340 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 14 01:30:33.202435 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 14 01:30:33.202531 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 14 01:30:33.202627 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 14 01:30:33.202723 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 14 01:30:33.202819 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 14 01:30:33.202918 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 14 01:30:33.203025 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 14 01:30:33.203121 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 14 01:30:33.203215 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 14 01:30:33.203313 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 14 01:30:33.203408 kernel: pci 
0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 14 01:30:33.203505 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 14 01:30:33.203599 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 14 01:30:33.203696 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 14 01:30:33.203791 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 14 01:30:33.203887 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 14 01:30:33.206036 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 14 01:30:33.206170 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 14 01:30:33.206272 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 14 01:30:33.206370 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 14 01:30:33.206467 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 14 01:30:33.206566 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 14 01:30:33.206662 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 14 01:30:33.206760 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 14 01:30:33.206856 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 14 01:30:33.206955 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 14 01:30:33.207061 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 14 01:30:33.207158 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 14 01:30:33.207254 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 14 01:30:33.207354 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 14 01:30:33.207450 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 14 01:30:33.207545 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 14 01:30:33.207640 kernel: pci 0000:00:05.2: bridge window [mem 
0x38d000000000-0x38d7ffffffff 64bit pref] Jan 14 01:30:33.207739 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 14 01:30:33.207835 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 14 01:30:33.207943 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 14 01:30:33.208047 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 14 01:30:33.208146 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 14 01:30:33.208241 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 14 01:30:33.208336 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 14 01:30:33.208431 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 14 01:30:33.208534 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 14 01:30:33.208622 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 14 01:30:33.208709 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 14 01:30:33.208795 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 14 01:30:33.208882 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 14 01:30:33.208968 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 14 01:30:33.209079 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 14 01:30:33.209170 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 14 01:30:33.209259 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 01:30:33.209357 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 14 01:30:33.209450 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 14 01:30:33.209542 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 01:30:33.209641 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 14 01:30:33.209730 kernel: pci_bus 0000:03: resource 2 [mem 
0x380800000000-0x380fffffffff 64bit pref] Jan 14 01:30:33.209831 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 14 01:30:33.209921 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 14 01:30:33.210721 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 14 01:30:33.210825 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 14 01:30:33.210923 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 14 01:30:33.211030 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 14 01:30:33.211129 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 14 01:30:33.211219 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 14 01:30:33.211319 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 14 01:30:33.211411 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 14 01:30:33.211508 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 14 01:30:33.211597 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 14 01:30:33.211695 kernel: pci_bus 0000:0a: resource 1 [mem 0x83000000-0x831fffff] Jan 14 01:30:33.211785 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 14 01:30:33.211885 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 14 01:30:33.211996 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 14 01:30:33.212098 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 14 01:30:33.212188 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 14 01:30:33.212297 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 14 01:30:33.212387 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 14 01:30:33.212485 
kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 14 01:30:33.212575 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 14 01:30:33.212677 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 14 01:30:33.212778 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 14 01:30:33.212884 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 14 01:30:33.212974 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 14 01:30:33.213644 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 14 01:30:33.213740 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 14 01:30:33.213840 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 14 01:30:33.213934 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 14 01:30:33.216076 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 14 01:30:33.216193 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 14 01:30:33.216289 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 14 01:30:33.216380 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 14 01:30:33.216482 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 14 01:30:33.216574 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 14 01:30:33.216663 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 14 01:30:33.216767 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 14 01:30:33.216857 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 14 01:30:33.216946 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 14 01:30:33.217066 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 14 01:30:33.217157 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] 
Jan 14 01:30:33.217247 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 14 01:30:33.217343 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 14 01:30:33.217433 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 14 01:30:33.217525 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 14 01:30:33.217624 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 14 01:30:33.217714 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 14 01:30:33.217804 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 14 01:30:33.217903 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 14 01:30:33.219718 kernel: pci_bus 0000:19: resource 1 [mem 0x81200000-0x813fffff] Jan 14 01:30:33.219831 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 14 01:30:33.219943 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 14 01:30:33.220046 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 14 01:30:33.220136 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 14 01:30:33.220234 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 14 01:30:33.220325 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 14 01:30:33.220419 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 14 01:30:33.220520 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 14 01:30:33.220612 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 14 01:30:33.220702 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 14 01:30:33.220801 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 14 01:30:33.220894 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 14 01:30:33.220992 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit 
pref] Jan 14 01:30:33.221092 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 14 01:30:33.221182 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 14 01:30:33.221273 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 14 01:30:33.221285 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 14 01:30:33.221296 kernel: PCI: CLS 0 bytes, default 64 Jan 14 01:30:33.221305 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 14 01:30:33.221314 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 14 01:30:33.221322 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 14 01:30:33.221331 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 14 01:30:33.221339 kernel: Initialise system trusted keyrings Jan 14 01:30:33.221349 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 14 01:30:33.221359 kernel: Key type asymmetric registered Jan 14 01:30:33.221368 kernel: Asymmetric key parser 'x509' registered Jan 14 01:30:33.221376 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 14 01:30:33.221385 kernel: io scheduler mq-deadline registered Jan 14 01:30:33.221393 kernel: io scheduler kyber registered Jan 14 01:30:33.221402 kernel: io scheduler bfq registered Jan 14 01:30:33.221513 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 14 01:30:33.221617 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 14 01:30:33.221718 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 14 01:30:33.221816 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 14 01:30:33.221917 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 14 01:30:33.223349 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 14 01:30:33.223476 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 14 01:30:33.223580 
kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 14 01:30:33.223687 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 14 01:30:33.223788 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 14 01:30:33.223888 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 14 01:30:33.224008 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 14 01:30:33.224110 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 14 01:30:33.224210 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 14 01:30:33.224309 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 14 01:30:33.224409 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 14 01:30:33.224423 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 14 01:30:33.224524 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 14 01:30:33.224623 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 14 01:30:33.224724 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 14 01:30:33.224821 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 14 01:30:33.224923 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 14 01:30:33.226241 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 14 01:30:33.226360 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 14 01:30:33.226459 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 14 01:30:33.226559 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 14 01:30:33.226662 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 14 01:30:33.226762 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 14 01:30:33.226864 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 14 01:30:33.226964 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 14 01:30:33.227076 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 14 01:30:33.227178 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 14 01:30:33.227276 kernel: 
pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 14 01:30:33.227288 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 14 01:30:33.227386 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 14 01:30:33.227483 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 14 01:30:33.227583 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 14 01:30:33.227680 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 14 01:30:33.227782 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 14 01:30:33.227879 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 14 01:30:33.228333 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 14 01:30:33.228450 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 14 01:30:33.228551 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 14 01:30:33.228649 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 14 01:30:33.228748 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 14 01:30:33.228850 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 14 01:30:33.228949 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 14 01:30:33.229064 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 14 01:30:33.229165 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 14 01:30:33.229262 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 14 01:30:33.229274 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 14 01:30:33.229374 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 14 01:30:33.229471 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 14 01:30:33.229571 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 14 01:30:33.229668 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 14 01:30:33.229767 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 14 01:30:33.229864 kernel: pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 14 01:30:33.229963 kernel: pcieport 0000:00:05.3: 
PME: Signaling with IRQ 51 Jan 14 01:30:33.230075 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 14 01:30:33.230175 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 14 01:30:33.230271 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 14 01:30:33.230282 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 14 01:30:33.230291 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 14 01:30:33.230315 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 14 01:30:33.230325 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 14 01:30:33.230335 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 14 01:30:33.230344 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 14 01:30:33.230353 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 14 01:30:33.230723 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 14 01:30:33.230826 kernel: rtc_cmos 00:03: registered as rtc0 Jan 14 01:30:33.230919 kernel: rtc_cmos 00:03: setting system clock to 2026-01-14T01:30:31 UTC (1768354231) Jan 14 01:30:33.231050 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 14 01:30:33.231062 kernel: intel_pstate: CPU model not supported Jan 14 01:30:33.231071 kernel: efifb: probing for efifb Jan 14 01:30:33.231080 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 14 01:30:33.231088 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 14 01:30:33.231097 kernel: efifb: scrolling: redraw Jan 14 01:30:33.231106 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 14 01:30:33.231117 kernel: Console: switching to colour frame buffer device 160x50 Jan 14 01:30:33.231126 kernel: fb0: EFI VGA frame buffer device Jan 14 01:30:33.231134 kernel: pstore: Using crash dump compression: deflate Jan 14 01:30:33.231329 kernel: pstore: Registered efi_pstore as persistent store backend Jan 14 
01:30:33.231339 kernel: NET: Registered PF_INET6 protocol family Jan 14 01:30:33.231347 kernel: Segment Routing with IPv6 Jan 14 01:30:33.231355 kernel: In-situ OAM (IOAM) with IPv6 Jan 14 01:30:33.231366 kernel: NET: Registered PF_PACKET protocol family Jan 14 01:30:33.231374 kernel: Key type dns_resolver registered Jan 14 01:30:33.231383 kernel: IPI shorthand broadcast: enabled Jan 14 01:30:33.231392 kernel: sched_clock: Marking stable (2569164136, 154191615)->(2825472896, -102117145) Jan 14 01:30:33.231400 kernel: registered taskstats version 1 Jan 14 01:30:33.231409 kernel: Loading compiled-in X.509 certificates Jan 14 01:30:33.231418 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: e43fcdb17feb86efe6ca4b76910b93467fb95f4f' Jan 14 01:30:33.231428 kernel: Demotion targets for Node 0: null Jan 14 01:30:33.231437 kernel: Key type .fscrypt registered Jan 14 01:30:33.231446 kernel: Key type fscrypt-provisioning registered Jan 14 01:30:33.231454 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 14 01:30:33.231462 kernel: ima: Allocated hash algorithm: sha1 Jan 14 01:30:33.231471 kernel: ima: No architecture policies found Jan 14 01:30:33.231479 kernel: clk: Disabling unused clocks Jan 14 01:30:33.231488 kernel: Freeing unused kernel image (initmem) memory: 15536K Jan 14 01:30:33.231498 kernel: Write protecting the kernel read-only data: 47104k Jan 14 01:30:33.231506 kernel: Freeing unused kernel image (rodata/data gap) memory: 1124K Jan 14 01:30:33.231515 kernel: Run /init as init process Jan 14 01:30:33.231523 kernel: with arguments: Jan 14 01:30:33.231532 kernel: /init Jan 14 01:30:33.231540 kernel: with environment: Jan 14 01:30:33.231549 kernel: HOME=/ Jan 14 01:30:33.231559 kernel: TERM=linux Jan 14 01:30:33.231567 kernel: SCSI subsystem initialized Jan 14 01:30:33.231576 kernel: libata version 3.00 loaded. 
Jan 14 01:30:33.231694 kernel: ahci 0000:00:1f.2: version 3.0 Jan 14 01:30:33.231707 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 14 01:30:33.232010 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 14 01:30:33.233076 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 14 01:30:33.233206 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 14 01:30:33.233329 kernel: scsi host0: ahci Jan 14 01:30:33.233438 kernel: scsi host1: ahci Jan 14 01:30:33.233560 kernel: scsi host2: ahci Jan 14 01:30:33.233664 kernel: scsi host3: ahci Jan 14 01:30:33.233773 kernel: scsi host4: ahci Jan 14 01:30:33.233885 kernel: scsi host5: ahci Jan 14 01:30:33.233898 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 55 lpm-pol 1 Jan 14 01:30:33.233908 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 55 lpm-pol 1 Jan 14 01:30:33.233917 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 55 lpm-pol 1 Jan 14 01:30:33.233926 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 55 lpm-pol 1 Jan 14 01:30:33.233937 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 55 lpm-pol 1 Jan 14 01:30:33.233949 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 55 lpm-pol 1 Jan 14 01:30:33.233958 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 14 01:30:33.233970 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 14 01:30:33.233979 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 14 01:30:33.233996 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 14 01:30:33.234005 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 14 01:30:33.234015 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 14 01:30:33.234024 kernel: ACPI: bus type USB registered Jan 14 01:30:33.234034 kernel: usbcore: registered new interface driver usbfs Jan 14 01:30:33.234043 kernel: usbcore: registered 
new interface driver hub Jan 14 01:30:33.234052 kernel: usbcore: registered new device driver usb Jan 14 01:30:33.234164 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 14 01:30:33.234273 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 14 01:30:33.234380 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 14 01:30:33.234481 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 14 01:30:33.234613 kernel: hub 1-0:1.0: USB hub found Jan 14 01:30:33.234723 kernel: hub 1-0:1.0: 2 ports detected Jan 14 01:30:33.234838 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 14 01:30:33.234937 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 14 01:30:33.234951 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 14 01:30:33.234962 kernel: GPT:25804799 != 104857599 Jan 14 01:30:33.234971 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 14 01:30:33.234979 kernel: GPT:25804799 != 104857599 Jan 14 01:30:33.234996 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 14 01:30:33.235004 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 14 01:30:33.235015 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 14 01:30:33.235024 kernel: device-mapper: uevent: version 1.0.3 Jan 14 01:30:33.235033 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 14 01:30:33.235042 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 14 01:30:33.235050 kernel: raid6: avx512x4 gen() 26544 MB/s Jan 14 01:30:33.235059 kernel: raid6: avx512x2 gen() 37064 MB/s Jan 14 01:30:33.235068 kernel: raid6: avx512x1 gen() 39999 MB/s Jan 14 01:30:33.235079 kernel: raid6: avx2x4 gen() 31101 MB/s Jan 14 01:30:33.235087 kernel: raid6: avx2x2 gen() 33388 MB/s Jan 14 01:30:33.235096 kernel: raid6: avx2x1 gen() 30553 MB/s Jan 14 01:30:33.235104 kernel: raid6: using algorithm avx512x1 gen() 39999 MB/s Jan 14 01:30:33.235113 kernel: raid6: .... xor() 24752 MB/s, rmw enabled Jan 14 01:30:33.235123 kernel: raid6: using avx512x2 recovery algorithm Jan 14 01:30:33.235253 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 14 01:30:33.235269 kernel: xor: automatically using best checksumming function avx Jan 14 01:30:33.235278 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 14 01:30:33.235287 kernel: BTRFS: device fsid cd6116b6-e1b6-44f4-b1e2-5e7c5565b295 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (204) Jan 14 01:30:33.235296 kernel: BTRFS info (device dm-0): first mount of filesystem cd6116b6-e1b6-44f4-b1e2-5e7c5565b295 Jan 14 01:30:33.235305 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 14 01:30:33.235313 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 14 01:30:33.235324 kernel: BTRFS info (device dm-0): enabling free space tree Jan 14 01:30:33.235333 kernel: loop: module loaded Jan 14 01:30:33.235342 kernel: loop0: detected capacity change from 0 to 100544 Jan 14 01:30:33.235350 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 14 01:30:33.235359 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 14 01:30:33.235368 kernel: usbcore: 
registered new interface driver usbhid Jan 14 01:30:33.235377 kernel: usbhid: USB HID core driver Jan 14 01:30:33.236346 systemd[1]: Successfully made /usr/ read-only. Jan 14 01:30:33.236361 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 01:30:33.236371 systemd[1]: Detected virtualization kvm. Jan 14 01:30:33.236381 systemd[1]: Detected architecture x86-64. Jan 14 01:30:33.236390 systemd[1]: Running in initrd. Jan 14 01:30:33.236399 systemd[1]: No hostname configured, using default hostname. Jan 14 01:30:33.236411 systemd[1]: Hostname set to . Jan 14 01:30:33.236419 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 01:30:33.236428 systemd[1]: Queued start job for default target initrd.target. Jan 14 01:30:33.236437 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 01:30:33.236447 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 01:30:33.236456 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:30:33.236468 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 14 01:30:33.236478 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 01:30:33.236487 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 14 01:30:33.236497 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 14 01:30:33.236506 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Jan 14 01:30:33.236515 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 14 01:30:33.236526 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 14 01:30:33.236535 systemd[1]: Reached target paths.target - Path Units. Jan 14 01:30:33.236544 systemd[1]: Reached target slices.target - Slice Units. Jan 14 01:30:33.236553 systemd[1]: Reached target swap.target - Swaps. Jan 14 01:30:33.236562 systemd[1]: Reached target timers.target - Timer Units. Jan 14 01:30:33.236571 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 01:30:33.236580 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 01:30:33.236590 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:30:33.236600 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 14 01:30:33.236609 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 14 01:30:33.236619 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:30:33.236630 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 01:30:33.236639 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 01:30:33.236648 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 01:30:33.236659 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 14 01:30:33.236668 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 14 01:30:33.236677 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 01:30:33.236686 systemd[1]: Finished network-cleanup.service - Network Cleanup. 
Jan 14 01:30:33.236695 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 14 01:30:33.236705 systemd[1]: Starting systemd-fsck-usr.service... Jan 14 01:30:33.236716 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 01:30:33.236725 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 01:30:33.236734 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:30:33.236743 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 14 01:30:33.236754 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:30:33.236763 systemd[1]: Finished systemd-fsck-usr.service. Jan 14 01:30:33.236772 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 14 01:30:33.236781 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:30:33.236791 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 14 01:30:33.236827 systemd-journald[342]: Collecting audit messages is enabled. Jan 14 01:30:33.236851 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 14 01:30:33.236861 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 01:30:33.236871 kernel: Bridge firewalling registered Jan 14 01:30:33.236882 kernel: audit: type=1130 audit(1768354233.179:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.236892 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Jan 14 01:30:33.236901 kernel: audit: type=1130 audit(1768354233.185:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.236910 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 01:30:33.236920 kernel: audit: type=1130 audit(1768354233.190:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.236929 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 14 01:30:33.236939 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 01:30:33.236949 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 14 01:30:33.236959 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 01:30:33.236968 kernel: audit: type=1130 audit(1768354233.231:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.236977 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:30:33.236996 systemd-journald[342]: Journal started Jan 14 01:30:33.237018 systemd-journald[342]: Runtime Journal (/run/log/journal/d33b19f426064484acd553494ef70dcf) is 8M, max 77.9M, 69.9M free. Jan 14 01:30:33.179000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.185000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 01:30:33.190000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.231000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.178337 systemd-modules-load[344]: Inserted module 'br_netfilter' Jan 14 01:30:33.243191 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 01:30:33.243215 kernel: audit: type=1130 audit(1768354233.237:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.237000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.243613 dracut-cmdline[362]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=ef461ed71f713584f576c99df12ffb04dd99b33cd2d16edeb307d0cf2f5b4260 Jan 14 01:30:33.252081 kernel: audit: type=1130 audit(1768354233.242:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:33.252104 kernel: audit: type=1334 audit(1768354233.246:8): prog-id=6 op=LOAD Jan 14 01:30:33.242000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.246000 audit: BPF prog-id=6 op=LOAD Jan 14 01:30:33.251023 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 14 01:30:33.254501 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 14 01:30:33.273475 systemd-tmpfiles[404]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 14 01:30:33.277877 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 01:30:33.278000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.283002 kernel: audit: type=1130 audit(1768354233.278:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.318221 systemd-resolved[400]: Positive Trust Anchors: Jan 14 01:30:33.318233 systemd-resolved[400]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 01:30:33.318236 systemd-resolved[400]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 01:30:33.318267 systemd-resolved[400]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 01:30:33.352638 systemd-resolved[400]: Defaulting to hostname 'linux'. Jan 14 01:30:33.355372 kernel: Loading iSCSI transport class v2.0-870. Jan 14 01:30:33.353584 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 01:30:33.355000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.356607 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 01:30:33.361004 kernel: audit: type=1130 audit(1768354233.355:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.373018 kernel: iscsi: registered transport (tcp) Jan 14 01:30:33.397043 kernel: iscsi: registered transport (qla4xxx) Jan 14 01:30:33.397129 kernel: QLogic iSCSI HBA Driver Jan 14 01:30:33.423506 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 01:30:33.442818 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. 
Jan 14 01:30:33.443000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.444646 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 01:30:33.485451 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 14 01:30:33.485000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.488110 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 14 01:30:33.491092 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 14 01:30:33.528620 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 14 01:30:33.528000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.529000 audit: BPF prog-id=7 op=LOAD Jan 14 01:30:33.529000 audit: BPF prog-id=8 op=LOAD Jan 14 01:30:33.531120 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 01:30:33.560973 systemd-udevd[620]: Using default interface naming scheme 'v257'. Jan 14 01:30:33.571349 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:30:33.572000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.577384 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Jan 14 01:30:33.588118 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 01:30:33.587000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.588000 audit: BPF prog-id=9 op=LOAD Jan 14 01:30:33.591112 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 01:30:33.598767 dracut-pre-trigger[707]: rd.md=0: removing MD RAID activation Jan 14 01:30:33.627564 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 01:30:33.629000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.631514 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 01:30:33.640414 systemd-networkd[719]: lo: Link UP Jan 14 01:30:33.640000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.640421 systemd-networkd[719]: lo: Gained carrier Jan 14 01:30:33.641362 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 14 01:30:33.642707 systemd[1]: Reached target network.target - Network. Jan 14 01:30:33.724673 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:30:33.727000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.730552 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Jan 14 01:30:33.838106 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 14 01:30:33.856011 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 14 01:30:33.857700 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 14 01:30:33.862000 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 14 01:30:33.895014 kernel: cryptd: max_cpu_qlen set to 1000 Jan 14 01:30:33.903441 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 14 01:30:33.923257 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 14 01:30:33.932039 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Jan 14 01:30:33.934425 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 14 01:30:33.938839 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:30:33.940178 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:30:33.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:33.942468 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:30:33.943465 kernel: AES CTR mode by8 optimization enabled Jan 14 01:30:33.968757 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 14 01:30:33.976145 systemd-networkd[719]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:30:33.976153 systemd-networkd[719]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 01:30:33.976465 systemd-networkd[719]: eth0: Link UP Jan 14 01:30:33.976955 systemd-networkd[719]: eth0: Gained carrier Jan 14 01:30:33.976967 systemd-networkd[719]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:30:33.999282 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 14 01:30:33.999000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:34.000907 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:30:34.000000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:34.002666 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 01:30:34.003741 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:30:34.005011 disk-uuid[887]: Primary Header is updated. Jan 14 01:30:34.005011 disk-uuid[887]: Secondary Entries is updated. Jan 14 01:30:34.005011 disk-uuid[887]: Secondary Header is updated. Jan 14 01:30:34.006227 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 01:30:34.008788 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... 
Jan 14 01:30:34.028432 systemd-networkd[719]: eth0: DHCPv4 address 10.0.22.183/25, gateway 10.0.22.129 acquired from 10.0.22.129 Jan 14 01:30:34.058698 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 14 01:30:34.067161 kernel: kauditd_printk_skb: 14 callbacks suppressed Jan 14 01:30:34.067211 kernel: audit: type=1130 audit(1768354234.061:25): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:34.061000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.072564 disk-uuid[891]: Warning: The kernel is still using the old partition table. Jan 14 01:30:35.072564 disk-uuid[891]: The new table will be used at the next reboot or after you Jan 14 01:30:35.072564 disk-uuid[891]: run partprobe(8) or kpartx(8) Jan 14 01:30:35.072564 disk-uuid[891]: The operation has completed successfully. Jan 14 01:30:35.087212 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 14 01:30:35.099382 kernel: audit: type=1130 audit(1768354235.088:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.099429 kernel: audit: type=1131 audit(1768354235.088:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.088000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:35.088000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.087494 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 14 01:30:35.093228 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 14 01:30:35.147018 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (910) Jan 14 01:30:35.150914 kernel: BTRFS info (device vda6): first mount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:30:35.150963 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 01:30:35.158534 kernel: BTRFS info (device vda6): turning on async discard Jan 14 01:30:35.158581 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 01:30:35.165017 kernel: BTRFS info (device vda6): last unmount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:30:35.165945 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 14 01:30:35.165000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.171013 kernel: audit: type=1130 audit(1768354235.165:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.171122 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Jan 14 01:30:35.319934 systemd-networkd[719]: eth0: Gained IPv6LL Jan 14 01:30:35.435084 ignition[929]: Ignition 2.24.0 Jan 14 01:30:35.435096 ignition[929]: Stage: fetch-offline Jan 14 01:30:35.435511 ignition[929]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:30:35.436837 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 01:30:35.436000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.435526 ignition[929]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:30:35.442010 kernel: audit: type=1130 audit(1768354235.436:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.438860 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 14 01:30:35.435614 ignition[929]: parsed url from cmdline: "" Jan 14 01:30:35.435617 ignition[929]: no config URL provided Jan 14 01:30:35.435622 ignition[929]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 01:30:35.435630 ignition[929]: no config at "/usr/lib/ignition/user.ign" Jan 14 01:30:35.435635 ignition[929]: failed to fetch config: resource requires networking Jan 14 01:30:35.435854 ignition[929]: Ignition finished successfully Jan 14 01:30:35.464905 ignition[936]: Ignition 2.24.0 Jan 14 01:30:35.464918 ignition[936]: Stage: fetch Jan 14 01:30:35.465438 ignition[936]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:30:35.465447 ignition[936]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:30:35.465537 ignition[936]: parsed url from cmdline: "" Jan 14 01:30:35.465540 ignition[936]: no config URL provided Jan 14 01:30:35.465548 ignition[936]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 01:30:35.465554 ignition[936]: no config at "/usr/lib/ignition/user.ign" Jan 14 01:30:35.465635 ignition[936]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 14 01:30:35.465768 ignition[936]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 14 01:30:35.465789 ignition[936]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Jan 14 01:30:35.971962 ignition[936]: GET result: OK Jan 14 01:30:35.972822 ignition[936]: parsing config with SHA512: e0268f2f1f29f0729f5585914ad98be6b44b2f54fe5330ec941ba342edbd42ca384c2ee0c3264736164282bfa1bcd658ccbca87a72bb6b3db578212e89f5f3e0 Jan 14 01:30:35.978461 unknown[936]: fetched base config from "system" Jan 14 01:30:35.978471 unknown[936]: fetched base config from "system" Jan 14 01:30:35.978806 ignition[936]: fetch: fetch complete Jan 14 01:30:35.978477 unknown[936]: fetched user config from "openstack" Jan 14 01:30:35.980000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.978811 ignition[936]: fetch: fetch passed Jan 14 01:30:35.981237 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 14 01:30:35.986224 kernel: audit: type=1130 audit(1768354235.980:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:35.978850 ignition[936]: Ignition finished successfully Jan 14 01:30:35.984102 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 14 01:30:36.009378 ignition[942]: Ignition 2.24.0 Jan 14 01:30:36.009389 ignition[942]: Stage: kargs Jan 14 01:30:36.009550 ignition[942]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:30:36.009558 ignition[942]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:30:36.011357 ignition[942]: kargs: kargs passed Jan 14 01:30:36.011403 ignition[942]: Ignition finished successfully Jan 14 01:30:36.013197 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 14 01:30:36.017009 kernel: audit: type=1130 audit(1768354236.012:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 01:30:36.012000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:36.014531 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 14 01:30:36.038532 ignition[949]: Ignition 2.24.0 Jan 14 01:30:36.038543 ignition[949]: Stage: disks Jan 14 01:30:36.038676 ignition[949]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:30:36.038684 ignition[949]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:30:36.040885 ignition[949]: disks: disks passed Jan 14 01:30:36.040929 ignition[949]: Ignition finished successfully Jan 14 01:30:36.043215 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 14 01:30:36.042000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:36.044169 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 14 01:30:36.047314 kernel: audit: type=1130 audit(1768354236.042:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:36.047728 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 14 01:30:36.048102 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 01:30:36.048425 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 01:30:36.048737 systemd[1]: Reached target basic.target - Basic System. Jan 14 01:30:36.051320 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Jan 14 01:30:36.105981 systemd-fsck[957]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 14 01:30:36.109628 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 14 01:30:36.109000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:36.112088 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 14 01:30:36.115539 kernel: audit: type=1130 audit(1768354236.109:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:36.300368 kernel: EXT4-fs (vda9): mounted filesystem 9c98b0a3-27fc-41c4-a169-349b38bd9ceb r/w with ordered data mode. Quota mode: none. Jan 14 01:30:36.301354 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 14 01:30:36.302973 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 14 01:30:36.309459 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 14 01:30:36.311927 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 14 01:30:36.314472 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 14 01:30:36.317198 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 14 01:30:36.320133 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 14 01:30:36.320183 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 01:30:36.330906 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 14 01:30:36.334172 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 14 01:30:36.370031 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (965) Jan 14 01:30:36.374292 kernel: BTRFS info (device vda6): first mount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:30:36.374350 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 01:30:36.398532 kernel: BTRFS info (device vda6): turning on async discard Jan 14 01:30:36.398622 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 01:30:36.401512 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 01:30:36.436016 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:36.552383 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 14 01:30:36.552000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:36.554663 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 14 01:30:36.557540 kernel: audit: type=1130 audit(1768354236.552:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:36.560123 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 14 01:30:36.573072 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Jan 14 01:30:36.576028 kernel: BTRFS info (device vda6): last unmount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:30:36.599547 ignition[1066]: INFO : Ignition 2.24.0 Jan 14 01:30:36.599547 ignition[1066]: INFO : Stage: mount Jan 14 01:30:36.599547 ignition[1066]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:30:36.599547 ignition[1066]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:30:36.602663 ignition[1066]: INFO : mount: mount passed Jan 14 01:30:36.602663 ignition[1066]: INFO : Ignition finished successfully Jan 14 01:30:36.603231 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 14 01:30:36.603000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:36.607624 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 14 01:30:36.607000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:37.475036 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:39.488005 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:43.495445 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:43.505347 coreos-metadata[967]: Jan 14 01:30:43.505 WARN failed to locate config-drive, using the metadata service API instead Jan 14 01:30:43.523655 coreos-metadata[967]: Jan 14 01:30:43.523 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 14 01:30:43.687030 coreos-metadata[967]: Jan 14 01:30:43.686 INFO Fetch successful Jan 14 01:30:43.687030 coreos-metadata[967]: Jan 14 01:30:43.686 INFO wrote hostname ci-4578-0-0-p-557efd55ff to /sysroot/etc/hostname Jan 14 01:30:43.691710 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 14 01:30:43.720113 kernel: kauditd_printk_skb: 2 callbacks suppressed Jan 14 01:30:43.720207 kernel: audit: type=1130 audit(1768354243.691:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:43.720242 kernel: audit: type=1131 audit(1768354243.691:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:43.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:43.691000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:43.691957 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 14 01:30:43.697141 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 14 01:30:43.744173 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 14 01:30:43.792011 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1082) Jan 14 01:30:43.798019 kernel: BTRFS info (device vda6): first mount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:30:43.798085 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 01:30:43.811401 kernel: BTRFS info (device vda6): turning on async discard Jan 14 01:30:43.811477 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 01:30:43.813695 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 01:30:43.846013 ignition[1099]: INFO : Ignition 2.24.0 Jan 14 01:30:43.846013 ignition[1099]: INFO : Stage: files Jan 14 01:30:43.847254 ignition[1099]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:30:43.847254 ignition[1099]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:30:43.847254 ignition[1099]: DEBUG : files: compiled without relabeling support, skipping Jan 14 01:30:43.848453 ignition[1099]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 14 01:30:43.848453 ignition[1099]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 14 01:30:43.854386 ignition[1099]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 14 01:30:43.854898 ignition[1099]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 14 01:30:43.857257 unknown[1099]: wrote ssh authorized keys file for user: core Jan 14 01:30:43.857906 ignition[1099]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 14 01:30:43.860104 
ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 14 01:30:43.861178 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jan 14 01:30:44.972627 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 14 01:30:45.085483 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 14 01:30:45.086660 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 14 01:30:45.086660 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 14 01:30:45.086660 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 14 01:30:45.086660 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 14 01:30:45.086660 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 01:30:45.086660 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 01:30:45.086660 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 01:30:45.086660 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 01:30:45.089741 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 01:30:45.089741 
ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 01:30:45.089741 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 14 01:30:45.089741 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 14 01:30:45.091815 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 14 01:30:45.091815 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jan 14 01:30:45.366787 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 14 01:30:45.958832 ignition[1099]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 14 01:30:45.958832 ignition[1099]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 14 01:30:45.962538 ignition[1099]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 01:30:45.966534 ignition[1099]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 01:30:45.966534 ignition[1099]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 14 01:30:45.966534 ignition[1099]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 14 01:30:45.972401 kernel: audit: type=1130 
audit(1768354245.967:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:45.967000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:45.972491 ignition[1099]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 14 01:30:45.972491 ignition[1099]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 14 01:30:45.972491 ignition[1099]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 14 01:30:45.972491 ignition[1099]: INFO : files: files passed Jan 14 01:30:45.972491 ignition[1099]: INFO : Ignition finished successfully Jan 14 01:30:45.968286 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 14 01:30:45.971644 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 14 01:30:45.975104 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 14 01:30:45.993622 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 14 01:30:45.994366 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 14 01:30:45.995000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.000064 kernel: audit: type=1130 audit(1768354245.995:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:45.995000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.004044 kernel: audit: type=1131 audit(1768354245.995:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.005913 initrd-setup-root-after-ignition[1131]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:30:46.005913 initrd-setup-root-after-ignition[1131]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:30:46.008847 initrd-setup-root-after-ignition[1135]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:30:46.010490 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 01:30:46.010000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.011650 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 14 01:30:46.016110 kernel: audit: type=1130 audit(1768354246.010:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.017307 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 14 01:30:46.057622 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 14 01:30:46.057730 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Jan 14 01:30:46.067266 kernel: audit: type=1130 audit(1768354246.058:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.067297 kernel: audit: type=1131 audit(1768354246.058:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.058000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.058000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.059625 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 14 01:30:46.067803 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 14 01:30:46.069122 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 14 01:30:46.070057 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 14 01:30:46.094856 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 01:30:46.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.099001 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 14 01:30:46.102671 kernel: audit: type=1130 audit(1768354246.095:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 01:30:46.119871 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 01:30:46.120029 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 14 01:30:46.121339 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:30:46.122589 systemd[1]: Stopped target timers.target - Timer Units. Jan 14 01:30:46.123927 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 14 01:30:46.129389 kernel: audit: type=1131 audit(1768354246.124:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.124000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.124073 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 01:30:46.129507 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 14 01:30:46.130740 systemd[1]: Stopped target basic.target - Basic System. Jan 14 01:30:46.131961 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 14 01:30:46.133206 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 01:30:46.134450 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 14 01:30:46.135782 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 14 01:30:46.137010 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 14 01:30:46.138118 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 01:30:46.139289 systemd[1]: Stopped target sysinit.target - System Initialization. 
Jan 14 01:30:46.140450 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 14 01:30:46.141568 systemd[1]: Stopped target swap.target - Swaps. Jan 14 01:30:46.142693 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 14 01:30:46.142000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.142827 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 14 01:30:46.144439 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 14 01:30:46.145541 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 01:30:46.146570 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 14 01:30:46.146669 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 01:30:46.147000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.147704 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 14 01:30:46.147820 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 14 01:30:46.149000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.149764 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 14 01:30:46.149881 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 01:30:46.150000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 01:30:46.150914 systemd[1]: ignition-files.service: Deactivated successfully. Jan 14 01:30:46.151031 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 14 01:30:46.152861 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 14 01:30:46.155768 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 14 01:30:46.157133 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 14 01:30:46.158313 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 01:30:46.158000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.159805 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 14 01:30:46.160357 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:30:46.160000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.161327 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 14 01:30:46.161850 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 01:30:46.162000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.166502 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 14 01:30:46.167067 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 14 01:30:46.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 01:30:46.166000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.182185 ignition[1155]: INFO : Ignition 2.24.0 Jan 14 01:30:46.182185 ignition[1155]: INFO : Stage: umount Jan 14 01:30:46.184257 ignition[1155]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:30:46.184257 ignition[1155]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:30:46.184257 ignition[1155]: INFO : umount: umount passed Jan 14 01:30:46.184257 ignition[1155]: INFO : Ignition finished successfully Jan 14 01:30:46.185000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.185011 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 14 01:30:46.187000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.185128 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 14 01:30:46.188000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.187113 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 14 01:30:46.189000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.187183 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 14 01:30:46.188356 systemd[1]: ignition-kargs.service: Deactivated successfully. 
Jan 14 01:30:46.191000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.188398 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 14 01:30:46.189497 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 14 01:30:46.189544 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 14 01:30:46.190669 systemd[1]: Stopped target network.target - Network. Jan 14 01:30:46.191663 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 14 01:30:46.191704 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 01:30:46.192722 systemd[1]: Stopped target paths.target - Path Units. Jan 14 01:30:46.194068 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 14 01:30:46.198077 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:30:46.198580 systemd[1]: Stopped target slices.target - Slice Units. Jan 14 01:30:46.199505 systemd[1]: Stopped target sockets.target - Socket Units. Jan 14 01:30:46.200523 systemd[1]: iscsid.socket: Deactivated successfully. Jan 14 01:30:46.200563 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 01:30:46.201473 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 14 01:30:46.201505 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 01:30:46.203000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.202394 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. 
Jan 14 01:30:46.204000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.202419 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:30:46.203310 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 14 01:30:46.203360 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 14 01:30:46.204317 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 14 01:30:46.204358 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 14 01:30:46.205327 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 14 01:30:46.206221 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 14 01:30:46.210579 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 14 01:30:46.211513 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 14 01:30:46.211609 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 14 01:30:46.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.213412 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 14 01:30:46.213494 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 14 01:30:46.213000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.215000 audit: BPF prog-id=6 op=UNLOAD Jan 14 01:30:46.216444 systemd[1]: systemd-networkd.service: Deactivated successfully. 
Jan 14 01:30:46.216000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.217147 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 14 01:30:46.219776 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 14 01:30:46.220327 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 14 01:30:46.220000 audit: BPF prog-id=9 op=UNLOAD Jan 14 01:30:46.220374 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:30:46.221000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.221329 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 14 01:30:46.221380 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 14 01:30:46.223174 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 14 01:30:46.225070 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 14 01:30:46.225127 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 01:30:46.226000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.227123 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 14 01:30:46.227181 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 14 01:30:46.227000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:46.228076 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 14 01:30:46.228000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.228116 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 14 01:30:46.229226 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 01:30:46.242462 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 14 01:30:46.242624 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:30:46.243000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.245028 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 14 01:30:46.245117 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 14 01:30:46.247566 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 14 01:30:46.247599 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 01:30:46.248000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.248677 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 14 01:30:46.248722 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 14 01:30:46.250000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:46.250245 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 14 01:30:46.251000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.250296 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 14 01:30:46.251947 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 14 01:30:46.252040 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 01:30:46.261175 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 14 01:30:46.262397 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 14 01:30:46.262833 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 01:30:46.262000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.263865 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 14 01:30:46.264345 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:30:46.264000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.265231 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 14 01:30:46.265669 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Jan 14 01:30:46.265000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.266539 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 14 01:30:46.267002 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:30:46.266000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.267870 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:30:46.268350 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:30:46.268000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.269944 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 14 01:30:46.270516 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 14 01:30:46.270000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.271576 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 14 01:30:46.272127 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 14 01:30:46.271000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:46.271000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:46.273645 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 14 01:30:46.275145 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 14 01:30:46.309034 systemd[1]: Switching root. Jan 14 01:30:46.342717 systemd-journald[342]: Journal stopped Jan 14 01:30:48.262248 systemd-journald[342]: Received SIGTERM from PID 1 (systemd). Jan 14 01:30:48.262965 kernel: SELinux: policy capability network_peer_controls=1 Jan 14 01:30:48.262995 kernel: SELinux: policy capability open_perms=1 Jan 14 01:30:48.263012 kernel: SELinux: policy capability extended_socket_class=1 Jan 14 01:30:48.263023 kernel: SELinux: policy capability always_check_network=0 Jan 14 01:30:48.263037 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 14 01:30:48.263052 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 14 01:30:48.263066 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 14 01:30:48.263077 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 14 01:30:48.263088 kernel: SELinux: policy capability userspace_initial_context=0 Jan 14 01:30:48.263102 systemd[1]: Successfully loaded SELinux policy in 63.845ms. Jan 14 01:30:48.263119 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.433ms. Jan 14 01:30:48.263133 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 01:30:48.263145 systemd[1]: Detected virtualization kvm. 
Jan 14 01:30:48.263156 systemd[1]: Detected architecture x86-64. Jan 14 01:30:48.263167 systemd[1]: Detected first boot. Jan 14 01:30:48.263180 systemd[1]: Hostname set to . Jan 14 01:30:48.263193 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 01:30:48.263204 zram_generator::config[1198]: No configuration found. Jan 14 01:30:48.263217 kernel: Guest personality initialized and is inactive Jan 14 01:30:48.263229 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 14 01:30:48.263241 kernel: Initialized host personality Jan 14 01:30:48.263254 kernel: NET: Registered PF_VSOCK protocol family Jan 14 01:30:48.263265 systemd[1]: Populated /etc with preset unit settings. Jan 14 01:30:48.263281 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 14 01:30:48.263304 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 14 01:30:48.263315 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 14 01:30:48.263337 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 14 01:30:48.263349 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 14 01:30:48.263363 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 14 01:30:48.263375 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 14 01:30:48.263389 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 14 01:30:48.263401 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 14 01:30:48.263412 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 14 01:30:48.263423 systemd[1]: Created slice user.slice - User and Session Slice. Jan 14 01:30:48.263434 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Jan 14 01:30:48.263445 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:30:48.263457 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 14 01:30:48.263472 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 14 01:30:48.263483 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 14 01:30:48.263495 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 01:30:48.263506 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 14 01:30:48.263520 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 01:30:48.263531 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 14 01:30:48.263542 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 14 01:30:48.263554 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 14 01:30:48.263564 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 14 01:30:48.263576 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 14 01:30:48.263588 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:30:48.263600 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 01:30:48.263611 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 14 01:30:48.263623 systemd[1]: Reached target slices.target - Slice Units. Jan 14 01:30:48.263634 systemd[1]: Reached target swap.target - Swaps. Jan 14 01:30:48.263645 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 14 01:30:48.263656 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. 
Jan 14 01:30:48.263667 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 14 01:30:48.263678 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:30:48.263694 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 14 01:30:48.263705 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:30:48.263725 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 14 01:30:48.263737 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 14 01:30:48.263749 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 01:30:48.263760 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 01:30:48.263771 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 14 01:30:48.263784 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 14 01:30:48.263797 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 14 01:30:48.263808 systemd[1]: Mounting media.mount - External Media Directory... Jan 14 01:30:48.263820 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:30:48.263834 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 14 01:30:48.263845 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 14 01:30:48.263856 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 14 01:30:48.263869 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 14 01:30:48.263880 systemd[1]: Reached target machines.target - Containers. 
Jan 14 01:30:48.263892 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 14 01:30:48.263903 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:30:48.263916 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 01:30:48.263928 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 14 01:30:48.263940 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 01:30:48.263952 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 01:30:48.263964 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 01:30:48.263976 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 14 01:30:48.265095 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 01:30:48.265115 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 14 01:30:48.265128 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 14 01:30:48.265140 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 14 01:30:48.265152 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 14 01:30:48.265164 systemd[1]: Stopped systemd-fsck-usr.service. Jan 14 01:30:48.265176 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:30:48.265190 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 01:30:48.265201 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Jan 14 01:30:48.265213 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 01:30:48.265225 kernel: fuse: init (API version 7.41) Jan 14 01:30:48.265237 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 14 01:30:48.265248 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 14 01:30:48.265260 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 01:30:48.265273 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:30:48.265284 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 14 01:30:48.265295 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 14 01:30:48.265307 systemd[1]: Mounted media.mount - External Media Directory. Jan 14 01:30:48.265321 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 14 01:30:48.265332 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 14 01:30:48.265343 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 14 01:30:48.265355 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:30:48.265366 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 14 01:30:48.265377 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 14 01:30:48.265389 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 01:30:48.265401 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 01:30:48.265413 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 01:30:48.265424 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Jan 14 01:30:48.265460 systemd-journald[1275]: Collecting audit messages is enabled. Jan 14 01:30:48.265481 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 14 01:30:48.265493 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 14 01:30:48.265506 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 01:30:48.265518 systemd-journald[1275]: Journal started Jan 14 01:30:48.265540 systemd-journald[1275]: Runtime Journal (/run/log/journal/d33b19f426064484acd553494ef70dcf) is 8M, max 77.9M, 69.9M free. Jan 14 01:30:48.153000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.156000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.161000 audit: BPF prog-id=14 op=UNLOAD Jan 14 01:30:48.161000 audit: BPF prog-id=13 op=UNLOAD Jan 14 01:30:48.166000 audit: BPF prog-id=15 op=LOAD Jan 14 01:30:48.166000 audit: BPF prog-id=16 op=LOAD Jan 14 01:30:48.166000 audit: BPF prog-id=17 op=LOAD Jan 14 01:30:48.239000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.244000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.244000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 01:30:48.250000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.250000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.257000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 14 01:30:48.257000 audit[1275]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=6 a1=7ffefc545d40 a2=4000 a3=0 items=0 ppid=1 pid=1275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:30:48.257000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 14 01:30:48.258000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.258000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.264000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:48.264000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:47.948660 systemd[1]: Queued start job for default target multi-user.target. Jan 14 01:30:47.974130 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 14 01:30:47.974616 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 14 01:30:48.267191 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 01:30:48.271041 kernel: ACPI: bus type drm_connector registered Jan 14 01:30:48.270000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.270000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.279652 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 01:30:48.272000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.275000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:48.276129 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 01:30:48.276919 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 01:30:48.279489 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 01:30:48.281249 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 01:30:48.280000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.280000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.282520 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 14 01:30:48.283000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.298119 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 14 01:30:48.298000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.299331 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 01:30:48.302129 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 14 01:30:48.302714 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). 
Jan 14 01:30:48.302799 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 01:30:48.305097 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 14 01:30:48.305748 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 01:30:48.305894 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 01:30:48.309150 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 14 01:30:48.311087 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 14 01:30:48.312065 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 01:30:48.313141 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 14 01:30:48.314064 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 01:30:48.315217 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 01:30:48.321590 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 14 01:30:48.328841 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 14 01:30:48.330068 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 14 01:30:48.329000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:48.343843 systemd-journald[1275]: Time spent on flushing to /var/log/journal/d33b19f426064484acd553494ef70dcf is 64.407ms for 1835 entries. Jan 14 01:30:48.343843 systemd-journald[1275]: System Journal (/var/log/journal/d33b19f426064484acd553494ef70dcf) is 8M, max 588.1M, 580.1M free. Jan 14 01:30:48.434118 systemd-journald[1275]: Received client request to flush runtime journal. Jan 14 01:30:48.434194 kernel: loop1: detected capacity change from 0 to 50784 Jan 14 01:30:48.434840 kernel: loop2: detected capacity change from 0 to 224512 Jan 14 01:30:48.355000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.398000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.404000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.429000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.354541 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 14 01:30:48.356321 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. 
Jan 14 01:30:48.358661 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 14 01:30:48.385378 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 01:30:48.391643 systemd-tmpfiles[1322]: ACLs are not supported, ignoring. Jan 14 01:30:48.391654 systemd-tmpfiles[1322]: ACLs are not supported, ignoring. Jan 14 01:30:48.438000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.398079 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 01:30:48.403179 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 14 01:30:48.405315 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:30:48.429704 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 14 01:30:48.439053 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 14 01:30:48.471509 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 14 01:30:48.471000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.472000 audit: BPF prog-id=18 op=LOAD Jan 14 01:30:48.472000 audit: BPF prog-id=19 op=LOAD Jan 14 01:30:48.472000 audit: BPF prog-id=20 op=LOAD Jan 14 01:30:48.476000 audit: BPF prog-id=21 op=LOAD Jan 14 01:30:48.476247 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 14 01:30:48.479683 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 14 01:30:48.482158 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 14 01:30:48.493000 audit: BPF prog-id=22 op=LOAD Jan 14 01:30:48.493000 audit: BPF prog-id=23 op=LOAD Jan 14 01:30:48.493000 audit: BPF prog-id=24 op=LOAD Jan 14 01:30:48.498000 audit: BPF prog-id=25 op=LOAD Jan 14 01:30:48.498000 audit: BPF prog-id=26 op=LOAD Jan 14 01:30:48.498000 audit: BPF prog-id=27 op=LOAD Jan 14 01:30:48.497116 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 14 01:30:48.503057 kernel: loop3: detected capacity change from 0 to 1656 Jan 14 01:30:48.502177 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 14 01:30:48.515099 systemd-tmpfiles[1344]: ACLs are not supported, ignoring. Jan 14 01:30:48.515115 systemd-tmpfiles[1344]: ACLs are not supported, ignoring. Jan 14 01:30:48.520109 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:30:48.519000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.537007 kernel: loop4: detected capacity change from 0 to 111560 Jan 14 01:30:48.562887 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 14 01:30:48.562000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.570058 systemd-nsresourced[1347]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 14 01:30:48.575000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 01:30:48.575045 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 14 01:30:48.591009 kernel: loop5: detected capacity change from 0 to 50784 Jan 14 01:30:48.620398 kernel: loop6: detected capacity change from 0 to 224512 Jan 14 01:30:48.661008 kernel: loop7: detected capacity change from 0 to 1656 Jan 14 01:30:48.678043 kernel: loop1: detected capacity change from 0 to 111560 Jan 14 01:30:48.681797 systemd-oomd[1342]: No swap; memory pressure usage will be degraded Jan 14 01:30:48.683308 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 14 01:30:48.683000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.689221 systemd-resolved[1343]: Positive Trust Anchors: Jan 14 01:30:48.689464 systemd-resolved[1343]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 01:30:48.689520 systemd-resolved[1343]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 01:30:48.689578 systemd-resolved[1343]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 01:30:48.730413 (sd-merge)[1363]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. 
Jan 14 01:30:48.734276 (sd-merge)[1363]: Merged extensions into '/usr'. Jan 14 01:30:48.734454 systemd-resolved[1343]: Using system hostname 'ci-4578-0-0-p-557efd55ff'. Jan 14 01:30:48.740342 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 01:30:48.743215 kernel: kauditd_printk_skb: 104 callbacks suppressed Jan 14 01:30:48.743281 kernel: audit: type=1130 audit(1768354248.740:149): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.740000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:48.743144 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 01:30:48.745282 systemd[1]: Reload requested from client PID 1321 ('systemd-sysext') (unit systemd-sysext.service)... Jan 14 01:30:48.745356 systemd[1]: Reloading... Jan 14 01:30:48.836016 zram_generator::config[1396]: No configuration found. Jan 14 01:30:49.004075 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 14 01:30:49.004555 systemd[1]: Reloading finished in 258 ms. Jan 14 01:30:49.038966 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 14 01:30:49.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.042432 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... 
Jan 14 01:30:49.050027 kernel: audit: type=1130 audit(1768354249.038:150): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.049849 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 14 01:30:49.055046 kernel: audit: type=1334 audit(1768354249.049:151): prog-id=8 op=UNLOAD Jan 14 01:30:49.049000 audit: BPF prog-id=8 op=UNLOAD Jan 14 01:30:49.053835 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 01:30:49.049000 audit: BPF prog-id=7 op=UNLOAD Jan 14 01:30:49.051000 audit: BPF prog-id=28 op=LOAD Jan 14 01:30:49.057390 kernel: audit: type=1334 audit(1768354249.049:152): prog-id=7 op=UNLOAD Jan 14 01:30:49.057425 kernel: audit: type=1334 audit(1768354249.051:153): prog-id=28 op=LOAD Jan 14 01:30:49.051000 audit: BPF prog-id=29 op=LOAD Jan 14 01:30:49.054000 audit: BPF prog-id=30 op=LOAD Jan 14 01:30:49.054000 audit: BPF prog-id=31 op=LOAD Jan 14 01:30:49.062380 kernel: audit: type=1334 audit(1768354249.051:154): prog-id=29 op=LOAD Jan 14 01:30:49.062419 kernel: audit: type=1334 audit(1768354249.054:155): prog-id=30 op=LOAD Jan 14 01:30:49.062438 kernel: audit: type=1334 audit(1768354249.054:156): prog-id=31 op=LOAD Jan 14 01:30:49.054000 audit: BPF prog-id=32 op=LOAD Jan 14 01:30:49.068031 kernel: audit: type=1334 audit(1768354249.054:157): prog-id=32 op=LOAD Jan 14 01:30:49.057000 audit: BPF prog-id=18 op=UNLOAD Jan 14 01:30:49.071065 kernel: audit: type=1334 audit(1768354249.057:158): prog-id=18 op=UNLOAD Jan 14 01:30:49.057000 audit: BPF prog-id=19 op=UNLOAD Jan 14 01:30:49.057000 audit: BPF prog-id=20 op=UNLOAD Jan 14 01:30:49.064000 audit: BPF prog-id=33 op=LOAD Jan 14 01:30:49.064000 audit: BPF prog-id=22 op=UNLOAD Jan 14 01:30:49.064000 audit: BPF prog-id=34 op=LOAD Jan 14 01:30:49.064000 audit: BPF prog-id=35 
op=LOAD Jan 14 01:30:49.064000 audit: BPF prog-id=23 op=UNLOAD Jan 14 01:30:49.064000 audit: BPF prog-id=24 op=UNLOAD Jan 14 01:30:49.066000 audit: BPF prog-id=36 op=LOAD Jan 14 01:30:49.066000 audit: BPF prog-id=21 op=UNLOAD Jan 14 01:30:49.067000 audit: BPF prog-id=37 op=LOAD Jan 14 01:30:49.067000 audit: BPF prog-id=25 op=UNLOAD Jan 14 01:30:49.067000 audit: BPF prog-id=38 op=LOAD Jan 14 01:30:49.067000 audit: BPF prog-id=39 op=LOAD Jan 14 01:30:49.069000 audit: BPF prog-id=26 op=UNLOAD Jan 14 01:30:49.069000 audit: BPF prog-id=27 op=UNLOAD Jan 14 01:30:49.071000 audit: BPF prog-id=40 op=LOAD Jan 14 01:30:49.071000 audit: BPF prog-id=15 op=UNLOAD Jan 14 01:30:49.071000 audit: BPF prog-id=41 op=LOAD Jan 14 01:30:49.071000 audit: BPF prog-id=42 op=LOAD Jan 14 01:30:49.072000 audit: BPF prog-id=16 op=UNLOAD Jan 14 01:30:49.072000 audit: BPF prog-id=17 op=UNLOAD Jan 14 01:30:49.076926 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 14 01:30:49.076000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.078084 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 14 01:30:49.078864 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 14 01:30:49.086151 systemd[1]: Starting ensure-sysext.service... Jan 14 01:30:49.089065 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 14 01:30:49.091047 systemd-udevd[1440]: Using default interface naming scheme 'v257'. Jan 14 01:30:49.108405 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 14 01:30:49.108428 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. 
Jan 14 01:30:49.108651 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 14 01:30:49.109589 systemd-tmpfiles[1445]: ACLs are not supported, ignoring. Jan 14 01:30:49.109637 systemd-tmpfiles[1445]: ACLs are not supported, ignoring. Jan 14 01:30:49.113421 systemd[1]: Reload requested from client PID 1444 ('systemctl') (unit ensure-sysext.service)... Jan 14 01:30:49.113515 systemd[1]: Reloading... Jan 14 01:30:49.117528 systemd-tmpfiles[1445]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 01:30:49.117538 systemd-tmpfiles[1445]: Skipping /boot Jan 14 01:30:49.124333 systemd-tmpfiles[1445]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 01:30:49.124345 systemd-tmpfiles[1445]: Skipping /boot Jan 14 01:30:49.206011 zram_generator::config[1486]: No configuration found. Jan 14 01:30:49.310037 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 14 01:30:49.315004 kernel: ACPI: button: Power Button [PWRF] Jan 14 01:30:49.343007 kernel: mousedev: PS/2 mouse device common for all mice Jan 14 01:30:49.410012 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 14 01:30:49.410309 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 14 01:30:49.412000 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 14 01:30:49.429010 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 14 01:30:49.433012 kernel: Console: switching to colour dummy device 80x25 Jan 14 01:30:49.436704 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 14 01:30:49.436969 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 14 01:30:49.436998 kernel: [drm] features: -context_init Jan 14 01:30:49.442008 kernel: [drm] number of scanouts: 1 Jan 14 01:30:49.442055 kernel: [drm] number of cap sets: 0 Jan 14 01:30:49.443002 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 
Jan 14 01:30:49.446003 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 14 01:30:49.451906 kernel: Console: switching to colour frame buffer device 160x50 Jan 14 01:30:49.456012 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 14 01:30:49.500434 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 14 01:30:49.500524 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 14 01:30:49.503444 systemd[1]: Reloading finished in 389 ms. Jan 14 01:30:49.509784 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:30:49.510000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.516088 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 01:30:49.515000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.542841 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 14 01:30:49.549117 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 14 01:30:49.551359 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 14 01:30:49.556108 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 14 01:30:49.562584 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 14 01:30:49.564000 audit: BPF prog-id=43 op=LOAD Jan 14 01:30:49.569037 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Jan 14 01:30:49.577517 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 14 01:30:49.578000 audit: BPF prog-id=44 op=LOAD Jan 14 01:30:49.578000 audit: BPF prog-id=45 op=LOAD Jan 14 01:30:49.578000 audit: BPF prog-id=28 op=UNLOAD Jan 14 01:30:49.578000 audit: BPF prog-id=29 op=UNLOAD Jan 14 01:30:49.579000 audit: BPF prog-id=46 op=LOAD Jan 14 01:30:49.579000 audit: BPF prog-id=36 op=UNLOAD Jan 14 01:30:49.580000 audit: BPF prog-id=47 op=LOAD Jan 14 01:30:49.580000 audit: BPF prog-id=33 op=UNLOAD Jan 14 01:30:49.580000 audit: BPF prog-id=48 op=LOAD Jan 14 01:30:49.580000 audit: BPF prog-id=49 op=LOAD Jan 14 01:30:49.580000 audit: BPF prog-id=34 op=UNLOAD Jan 14 01:30:49.580000 audit: BPF prog-id=35 op=UNLOAD Jan 14 01:30:49.584000 audit: BPF prog-id=50 op=LOAD Jan 14 01:30:49.584000 audit: BPF prog-id=40 op=UNLOAD Jan 14 01:30:49.584000 audit: BPF prog-id=51 op=LOAD Jan 14 01:30:49.584000 audit: BPF prog-id=52 op=LOAD Jan 14 01:30:49.584000 audit: BPF prog-id=41 op=UNLOAD Jan 14 01:30:49.584000 audit: BPF prog-id=42 op=UNLOAD Jan 14 01:30:49.585000 audit: BPF prog-id=53 op=LOAD Jan 14 01:30:49.585000 audit: BPF prog-id=30 op=UNLOAD Jan 14 01:30:49.585000 audit: BPF prog-id=54 op=LOAD Jan 14 01:30:49.585000 audit: BPF prog-id=55 op=LOAD Jan 14 01:30:49.585000 audit: BPF prog-id=31 op=UNLOAD Jan 14 01:30:49.585000 audit: BPF prog-id=32 op=UNLOAD Jan 14 01:30:49.585000 audit: BPF prog-id=56 op=LOAD Jan 14 01:30:49.585000 audit: BPF prog-id=37 op=UNLOAD Jan 14 01:30:49.585000 audit: BPF prog-id=57 op=LOAD Jan 14 01:30:49.585000 audit: BPF prog-id=58 op=LOAD Jan 14 01:30:49.585000 audit: BPF prog-id=38 op=UNLOAD Jan 14 01:30:49.585000 audit: BPF prog-id=39 op=UNLOAD Jan 14 01:30:49.611254 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:30:49.634870 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Jan 14 01:30:49.635066 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:30:49.636691 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 01:30:49.640809 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 01:30:49.643199 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 01:30:49.643360 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 01:30:49.643545 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 01:30:49.643663 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:30:49.643947 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:30:49.653153 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:30:49.653346 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:30:49.653542 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 01:30:49.653707 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Jan 14 01:30:49.653826 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:30:49.653941 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:30:49.661000 audit[1568]: SYSTEM_BOOT pid=1568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.659445 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:30:49.659670 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:30:49.663280 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 01:30:49.670319 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 14 01:30:49.670502 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 01:30:49.670683 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 01:30:49.670816 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:30:49.671049 systemd[1]: Reached target time-set.target - System Time Set. 
Jan 14 01:30:49.671180 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:30:49.676000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.677659 systemd[1]: Finished ensure-sysext.service. Jan 14 01:30:49.690682 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 14 01:30:49.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.703167 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 01:30:49.702000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.702000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.703395 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 01:30:49.705870 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 01:30:49.706566 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 01:30:49.708000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:30:49.708000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.708699 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 01:30:49.709610 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 01:30:49.711000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.711000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.713280 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 01:30:49.713522 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 01:30:49.724165 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 14 01:30:49.725000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.732077 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 01:30:49.732307 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Jan 14 01:30:49.734000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.734000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.738470 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 14 01:30:49.739000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.739595 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:30:49.740300 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:30:49.741000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.741000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:30:49.746494 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:30:49.764851 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 14 01:30:49.764928 kernel: pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 14 01:30:49.771006 kernel: PTP clock support registered Jan 14 01:30:49.790000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 14 01:30:49.790000 audit[1612]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd6e7018d0 a2=420 a3=0 items=0 ppid=1560 pid=1612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:30:49.790000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:30:49.791927 augenrules[1612]: No rules Jan 14 01:30:49.794828 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 01:30:49.795478 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 14 01:30:49.807320 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:30:49.807571 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:30:49.811621 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:30:49.841074 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 14 01:30:49.841711 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 14 01:30:49.885366 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 14 01:30:49.886062 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 14 01:30:49.900212 systemd-networkd[1567]: lo: Link UP Jan 14 01:30:49.900221 systemd-networkd[1567]: lo: Gained carrier Jan 14 01:30:49.902378 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Jan 14 01:30:49.902775 systemd-networkd[1567]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:30:49.902780 systemd-networkd[1567]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 01:30:49.905041 systemd[1]: Reached target network.target - Network. Jan 14 01:30:49.908020 systemd-networkd[1567]: eth0: Link UP Jan 14 01:30:49.908279 systemd-networkd[1567]: eth0: Gained carrier Jan 14 01:30:49.908304 systemd-networkd[1567]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:30:49.908684 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 14 01:30:49.912392 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 14 01:30:49.926047 systemd-networkd[1567]: eth0: DHCPv4 address 10.0.22.183/25, gateway 10.0.22.129 acquired from 10.0.22.129 Jan 14 01:30:49.956730 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 14 01:30:49.958624 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:30:50.476175 ldconfig[1564]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 14 01:30:50.485583 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 14 01:30:50.489212 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 14 01:30:50.511706 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 14 01:30:50.513044 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 01:30:50.513600 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Jan 14 01:30:50.514222 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 14 01:30:50.515718 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 14 01:30:50.516756 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 14 01:30:50.518230 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 14 01:30:50.518728 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 14 01:30:50.519309 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 14 01:30:50.519787 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 14 01:30:50.520280 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 14 01:30:50.520312 systemd[1]: Reached target paths.target - Path Units. Jan 14 01:30:50.520752 systemd[1]: Reached target timers.target - Timer Units. Jan 14 01:30:50.525064 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 14 01:30:50.530455 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 14 01:30:50.535259 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 14 01:30:50.536929 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 14 01:30:50.538861 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 14 01:30:50.543122 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 14 01:30:50.545479 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 14 01:30:50.548355 systemd[1]: Listening on docker.socket - Docker Socket for the API. 
Jan 14 01:30:50.551774 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 01:30:50.555061 systemd[1]: Reached target basic.target - Basic System. Jan 14 01:30:50.556124 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 14 01:30:50.556164 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 14 01:30:50.558583 systemd[1]: Starting chronyd.service - NTP client/server... Jan 14 01:30:50.562328 systemd[1]: Starting containerd.service - containerd container runtime... Jan 14 01:30:50.567814 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 14 01:30:50.569641 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 14 01:30:50.581417 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 14 01:30:50.585204 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 14 01:30:50.589522 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 14 01:30:50.590765 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 14 01:30:50.600425 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 14 01:30:50.603745 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 14 01:30:50.605001 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:50.621223 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 14 01:30:50.626175 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 14 01:30:50.631199 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Jan 14 01:30:50.632334 google_oslogin_nss_cache[1644]: oslogin_cache_refresh[1644]: Refreshing passwd entry cache Jan 14 01:30:50.633144 jq[1642]: false Jan 14 01:30:50.636036 oslogin_cache_refresh[1644]: Refreshing passwd entry cache Jan 14 01:30:50.641133 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 14 01:30:50.642059 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 14 01:30:50.642564 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 14 01:30:50.644811 systemd[1]: Starting update-engine.service - Update Engine... Jan 14 01:30:50.646924 google_oslogin_nss_cache[1644]: oslogin_cache_refresh[1644]: Failure getting users, quitting Jan 14 01:30:50.646968 oslogin_cache_refresh[1644]: Failure getting users, quitting Jan 14 01:30:50.647048 google_oslogin_nss_cache[1644]: oslogin_cache_refresh[1644]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 14 01:30:50.647072 oslogin_cache_refresh[1644]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 14 01:30:50.647143 google_oslogin_nss_cache[1644]: oslogin_cache_refresh[1644]: Refreshing group entry cache Jan 14 01:30:50.647165 oslogin_cache_refresh[1644]: Refreshing group entry cache Jan 14 01:30:50.655806 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 14 01:30:50.656363 google_oslogin_nss_cache[1644]: oslogin_cache_refresh[1644]: Failure getting groups, quitting Jan 14 01:30:50.656363 google_oslogin_nss_cache[1644]: oslogin_cache_refresh[1644]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Jan 14 01:30:50.656308 oslogin_cache_refresh[1644]: Failure getting groups, quitting Jan 14 01:30:50.656317 oslogin_cache_refresh[1644]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 14 01:30:50.667136 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 14 01:30:50.668863 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 14 01:30:50.670129 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 14 01:30:50.670451 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 14 01:30:50.670639 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 14 01:30:50.676522 chronyd[1637]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 14 01:30:50.677556 chronyd[1637]: Loaded seccomp filter (level 2) Jan 14 01:30:50.680377 systemd[1]: Started chronyd.service - NTP client/server. Jan 14 01:30:50.685003 extend-filesystems[1643]: Found /dev/vda6 Jan 14 01:30:50.690685 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 14 01:30:50.692691 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 14 01:30:50.698249 systemd[1]: motdgen.service: Deactivated successfully. Jan 14 01:30:50.701137 extend-filesystems[1643]: Found /dev/vda9 Jan 14 01:30:50.698491 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Jan 14 01:30:50.709202 extend-filesystems[1643]: Checking size of /dev/vda9 Jan 14 01:30:50.715735 jq[1657]: true Jan 14 01:30:50.721106 update_engine[1653]: I20260114 01:30:50.720372 1653 main.cc:92] Flatcar Update Engine starting Jan 14 01:30:50.735702 extend-filesystems[1643]: Resized partition /dev/vda9 Jan 14 01:30:50.742824 extend-filesystems[1692]: resize2fs 1.47.3 (8-Jul-2025) Jan 14 01:30:50.751269 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 14 01:30:50.764543 jq[1689]: true Jan 14 01:30:50.770880 dbus-daemon[1640]: [system] SELinux support is enabled Jan 14 01:30:50.771646 tar[1670]: linux-amd64/LICENSE Jan 14 01:30:50.776210 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 14 01:30:50.781279 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 14 01:30:50.781305 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 14 01:30:50.783856 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 14 01:30:50.783873 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 14 01:30:50.787001 tar[1670]: linux-amd64/helm Jan 14 01:30:50.808658 update_engine[1653]: I20260114 01:30:50.808606 1653 update_check_scheduler.cc:74] Next update check in 3m24s Jan 14 01:30:50.808742 systemd[1]: Started update-engine.service - Update Engine. Jan 14 01:30:50.816934 systemd-logind[1652]: New seat seat0. Jan 14 01:30:50.817421 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Jan 14 01:30:50.830838 systemd-logind[1652]: Watching system buttons on /dev/input/event3 (Power Button) Jan 14 01:30:50.830860 systemd-logind[1652]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 14 01:30:50.831127 systemd[1]: Started systemd-logind.service - User Login Management. Jan 14 01:30:50.999131 systemd-networkd[1567]: eth0: Gained IPv6LL Jan 14 01:30:51.034412 containerd[1684]: time="2026-01-14T01:30:51Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 14 01:30:51.005006 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 14 01:30:51.005946 systemd[1]: Reached target network-online.target - Network is Online. Jan 14 01:30:51.009355 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:30:51.013376 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 14 01:30:51.047237 locksmithd[1700]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 14 01:30:51.059869 bash[1717]: Updated "/home/core/.ssh/authorized_keys" Jan 14 01:30:51.061277 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 14 01:30:51.067428 containerd[1684]: time="2026-01-14T01:30:51.067389367Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 14 01:30:51.071763 systemd[1]: Starting sshkeys.service... Jan 14 01:30:51.095979 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 14 01:30:51.112644 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 14 01:30:51.117897 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 14 01:30:51.127716 containerd[1684]: time="2026-01-14T01:30:51.127661075Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.039µs" Jan 14 01:30:51.127716 containerd[1684]: time="2026-01-14T01:30:51.127710322Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 14 01:30:51.127798 containerd[1684]: time="2026-01-14T01:30:51.127752620Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 14 01:30:51.127798 containerd[1684]: time="2026-01-14T01:30:51.127765964Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 14 01:30:51.127912 containerd[1684]: time="2026-01-14T01:30:51.127895902Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 14 01:30:51.127931 containerd[1684]: time="2026-01-14T01:30:51.127917954Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 01:30:51.127982 containerd[1684]: time="2026-01-14T01:30:51.127969393Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 01:30:51.130698 containerd[1684]: time="2026-01-14T01:30:51.130674492Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 01:30:51.130950 containerd[1684]: time="2026-01-14T01:30:51.130932257Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 01:30:51.130973 containerd[1684]: time="2026-01-14T01:30:51.130952422Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 01:30:51.130973 containerd[1684]: time="2026-01-14T01:30:51.130963071Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 01:30:51.130973 containerd[1684]: time="2026-01-14T01:30:51.130970573Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 01:30:51.143428 containerd[1684]: time="2026-01-14T01:30:51.141909080Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 01:30:51.143428 containerd[1684]: time="2026-01-14T01:30:51.141946710Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 14 01:30:51.143428 containerd[1684]: time="2026-01-14T01:30:51.142076245Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 14 01:30:51.143428 containerd[1684]: time="2026-01-14T01:30:51.142244751Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 01:30:51.143428 containerd[1684]: time="2026-01-14T01:30:51.142268837Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 01:30:51.143428 containerd[1684]: time="2026-01-14T01:30:51.142278420Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 14 01:30:51.143428 containerd[1684]: time="2026-01-14T01:30:51.142304890Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 14 01:30:51.148025 containerd[1684]: 
time="2026-01-14T01:30:51.147195108Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 14 01:30:51.148025 containerd[1684]: time="2026-01-14T01:30:51.147299613Z" level=info msg="metadata content store policy set" policy=shared Jan 14 01:30:51.152014 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221112926Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221188379Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221313929Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221327064Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221339106Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221350955Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221365837Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221375931Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221389434Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221399318Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221413260Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221422465Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221432175Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 14 01:30:51.222016 containerd[1684]: time="2026-01-14T01:30:51.221442221Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221570366Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221586625Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221599072Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221620697Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221631713Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221641940Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images 
type=io.containerd.grpc.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221652986Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221662494Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221671699Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221680654Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221690441Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221711967Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221764143Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221776513Z" level=info msg="Start snapshots syncer" Jan 14 01:30:51.222325 containerd[1684]: time="2026-01-14T01:30:51.221796635Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 14 01:30:51.225159 containerd[1684]: time="2026-01-14T01:30:51.224247046Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 14 01:30:51.225159 containerd[1684]: time="2026-01-14T01:30:51.224301673Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 14 01:30:51.225281 containerd[1684]: 
time="2026-01-14T01:30:51.224354367Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224464452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224482887Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224492656Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224503436Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224514157Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224523092Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224532851Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224541437Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224550714Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224570271Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 01:30:51.225281 containerd[1684]: 
time="2026-01-14T01:30:51.224582032Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 01:30:51.225281 containerd[1684]: time="2026-01-14T01:30:51.224590898Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224599124Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224605850Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224613859Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224631528Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224642037Z" level=info msg="runtime interface created" Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224646996Z" level=info msg="created NRI interface" Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224653917Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224665213Z" level=info msg="Connect containerd service" Jan 14 01:30:51.225496 containerd[1684]: time="2026-01-14T01:30:51.224682892Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 14 01:30:51.228006 containerd[1684]: time="2026-01-14T01:30:51.227435118Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412739880Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412796040Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412819155Z" level=info msg="Start subscribing containerd event" Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412841782Z" level=info msg="Start recovering state" Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412914902Z" level=info msg="Start event monitor" Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412924891Z" level=info msg="Start cni network conf syncer for default" Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412932060Z" level=info msg="Start streaming server" Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412939712Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412947175Z" level=info msg="runtime interface starting up..." Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412952528Z" level=info msg="starting plugins..." Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.412965908Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 14 01:30:51.414921 containerd[1684]: time="2026-01-14T01:30:51.414054815Z" level=info msg="containerd successfully booted in 0.381926s" Jan 14 01:30:51.414238 systemd[1]: Started containerd.service - containerd container runtime. 
Jan 14 01:30:51.463256 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 14 01:30:51.550145 extend-filesystems[1692]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 14 01:30:51.550145 extend-filesystems[1692]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 14 01:30:51.550145 extend-filesystems[1692]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 14 01:30:51.557701 extend-filesystems[1643]: Resized filesystem in /dev/vda9 Jan 14 01:30:51.554140 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 14 01:30:51.554404 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 14 01:30:51.577122 sshd_keygen[1686]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 14 01:30:51.599758 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 14 01:30:51.605122 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 14 01:30:51.628379 systemd[1]: issuegen.service: Deactivated successfully. Jan 14 01:30:51.628651 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 14 01:30:51.632642 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 14 01:30:51.638279 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:51.659648 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 14 01:30:51.662601 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 14 01:30:51.667314 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 14 01:30:51.668704 systemd[1]: Reached target getty.target - Login Prompts. Jan 14 01:30:51.745372 tar[1670]: linux-amd64/README.md Jan 14 01:30:51.761068 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 14 01:30:52.169045 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:52.573123 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:30:52.584267 (kubelet)[1783]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:30:53.415837 kubelet[1783]: E0114 01:30:53.415780 1783 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:30:53.417895 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:30:53.418038 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:30:53.418671 systemd[1]: kubelet.service: Consumed 1.052s CPU time, 266.7M memory peak. Jan 14 01:30:53.654067 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:54.182015 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:57.013864 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 14 01:30:57.016772 systemd[1]: Started sshd@0-10.0.22.183:22-68.220.241.50:49598.service - OpenSSH per-connection server daemon (68.220.241.50:49598). Jan 14 01:30:57.599076 sshd[1795]: Accepted publickey for core from 68.220.241.50 port 49598 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:30:57.605150 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:30:57.634239 systemd-logind[1652]: New session 1 of user core. Jan 14 01:30:57.637241 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 14 01:30:57.639010 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Jan 14 01:30:57.663039 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:57.678510 coreos-metadata[1639]: Jan 14 01:30:57.678 WARN failed to locate config-drive, using the metadata service API instead Jan 14 01:30:57.695257 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 14 01:30:57.703542 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 14 01:30:57.715308 coreos-metadata[1639]: Jan 14 01:30:57.714 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 14 01:30:57.727016 (systemd)[1805]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:30:57.731771 systemd-logind[1652]: New session 2 of user core. Jan 14 01:30:57.899525 systemd[1805]: Queued start job for default target default.target. Jan 14 01:30:57.907339 systemd[1805]: Created slice app.slice - User Application Slice. Jan 14 01:30:57.907370 systemd[1805]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 14 01:30:57.907383 systemd[1805]: Reached target paths.target - Paths. Jan 14 01:30:57.907428 systemd[1805]: Reached target timers.target - Timers. Jan 14 01:30:57.908653 systemd[1805]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 14 01:30:57.911123 systemd[1805]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 14 01:30:57.924512 systemd[1805]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 14 01:30:57.930205 systemd[1805]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 14 01:30:57.931176 systemd[1805]: Reached target sockets.target - Sockets. Jan 14 01:30:57.931225 systemd[1805]: Reached target basic.target - Basic System. Jan 14 01:30:57.931259 systemd[1805]: Reached target default.target - Main User Target. Jan 14 01:30:57.931285 systemd[1805]: Startup finished in 189ms. 
Jan 14 01:30:57.932359 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 14 01:30:57.940210 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 14 01:30:58.188178 coreos-metadata[1639]: Jan 14 01:30:58.187 INFO Fetch successful Jan 14 01:30:58.188178 coreos-metadata[1639]: Jan 14 01:30:58.187 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 14 01:30:58.194064 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:30:58.207557 coreos-metadata[1740]: Jan 14 01:30:58.207 WARN failed to locate config-drive, using the metadata service API instead Jan 14 01:30:58.230900 coreos-metadata[1740]: Jan 14 01:30:58.230 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 14 01:30:58.255694 systemd[1]: Started sshd@1-10.0.22.183:22-68.220.241.50:49604.service - OpenSSH per-connection server daemon (68.220.241.50:49604). Jan 14 01:30:58.432976 coreos-metadata[1639]: Jan 14 01:30:58.432 INFO Fetch successful Jan 14 01:30:58.432976 coreos-metadata[1639]: Jan 14 01:30:58.432 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 14 01:30:58.435281 coreos-metadata[1740]: Jan 14 01:30:58.435 INFO Fetch successful Jan 14 01:30:58.435511 coreos-metadata[1740]: Jan 14 01:30:58.435 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 14 01:30:58.662579 coreos-metadata[1639]: Jan 14 01:30:58.662 INFO Fetch successful Jan 14 01:30:58.662911 coreos-metadata[1639]: Jan 14 01:30:58.662 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 14 01:30:58.749075 coreos-metadata[1740]: Jan 14 01:30:58.749 INFO Fetch successful Jan 14 01:30:58.752421 unknown[1740]: wrote ssh authorized keys file for user: core Jan 14 01:30:58.789307 update-ssh-keys[1825]: Updated "/home/core/.ssh/authorized_keys" Jan 14 01:30:58.791227 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata 
Agent (SSH Keys). Jan 14 01:30:58.793577 coreos-metadata[1639]: Jan 14 01:30:58.793 INFO Fetch successful Jan 14 01:30:58.793975 coreos-metadata[1639]: Jan 14 01:30:58.793 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 14 01:30:58.793641 systemd[1]: Finished sshkeys.service. Jan 14 01:30:58.827228 sshd[1821]: Accepted publickey for core from 68.220.241.50 port 49604 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:30:58.829329 sshd-session[1821]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:30:58.839176 systemd-logind[1652]: New session 3 of user core. Jan 14 01:30:58.845252 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 14 01:30:58.909685 coreos-metadata[1639]: Jan 14 01:30:58.909 INFO Fetch successful Jan 14 01:30:58.909685 coreos-metadata[1639]: Jan 14 01:30:58.909 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 14 01:30:59.030568 coreos-metadata[1639]: Jan 14 01:30:59.030 INFO Fetch successful Jan 14 01:30:59.077223 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 14 01:30:59.078271 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 14 01:30:59.078516 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 14 01:30:59.080065 systemd[1]: Startup finished in 3.789s (kernel) + 14.425s (initrd) + 11.909s (userspace) = 30.124s. Jan 14 01:30:59.137656 sshd[1829]: Connection closed by 68.220.241.50 port 49604 Jan 14 01:30:59.138620 sshd-session[1821]: pam_unix(sshd:session): session closed for user core Jan 14 01:30:59.146136 systemd[1]: sshd@1-10.0.22.183:22-68.220.241.50:49604.service: Deactivated successfully. Jan 14 01:30:59.149541 systemd[1]: session-3.scope: Deactivated successfully. Jan 14 01:30:59.153272 systemd-logind[1652]: Session 3 logged out. Waiting for processes to exit. 
Jan 14 01:30:59.154728 systemd-logind[1652]: Removed session 3. Jan 14 01:30:59.260385 systemd[1]: Started sshd@2-10.0.22.183:22-68.220.241.50:49610.service - OpenSSH per-connection server daemon (68.220.241.50:49610). Jan 14 01:30:59.846026 sshd[1840]: Accepted publickey for core from 68.220.241.50 port 49610 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:30:59.848255 sshd-session[1840]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:30:59.855708 systemd-logind[1652]: New session 4 of user core. Jan 14 01:30:59.862205 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 14 01:31:00.157425 sshd[1844]: Connection closed by 68.220.241.50 port 49610 Jan 14 01:31:00.157297 sshd-session[1840]: pam_unix(sshd:session): session closed for user core Jan 14 01:31:00.164467 systemd[1]: sshd@2-10.0.22.183:22-68.220.241.50:49610.service: Deactivated successfully. Jan 14 01:31:00.167053 systemd[1]: session-4.scope: Deactivated successfully. Jan 14 01:31:00.168894 systemd-logind[1652]: Session 4 logged out. Waiting for processes to exit. Jan 14 01:31:00.170373 systemd-logind[1652]: Removed session 4. Jan 14 01:31:03.623303 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 14 01:31:03.625744 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:31:03.771374 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:31:03.778577 (kubelet)[1856]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:31:04.316774 kubelet[1856]: E0114 01:31:04.316674 1856 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:31:04.324959 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:31:04.325173 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:31:04.325960 systemd[1]: kubelet.service: Consumed 212ms CPU time, 110.1M memory peak. Jan 14 01:31:10.287719 systemd[1]: Started sshd@3-10.0.22.183:22-68.220.241.50:35660.service - OpenSSH per-connection server daemon (68.220.241.50:35660). Jan 14 01:31:10.852059 sshd[1865]: Accepted publickey for core from 68.220.241.50 port 35660 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:31:10.853523 sshd-session[1865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:31:10.859397 systemd-logind[1652]: New session 5 of user core. Jan 14 01:31:10.873587 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 14 01:31:11.149035 sshd[1869]: Connection closed by 68.220.241.50 port 35660 Jan 14 01:31:11.147855 sshd-session[1865]: pam_unix(sshd:session): session closed for user core Jan 14 01:31:11.154938 systemd[1]: sshd@3-10.0.22.183:22-68.220.241.50:35660.service: Deactivated successfully. Jan 14 01:31:11.159327 systemd[1]: session-5.scope: Deactivated successfully. Jan 14 01:31:11.161690 systemd-logind[1652]: Session 5 logged out. Waiting for processes to exit. Jan 14 01:31:11.166485 systemd-logind[1652]: Removed session 5. 
Jan 14 01:31:11.257541 systemd[1]: Started sshd@4-10.0.22.183:22-68.220.241.50:45036.service - OpenSSH per-connection server daemon (68.220.241.50:45036). Jan 14 01:31:11.821694 sshd[1875]: Accepted publickey for core from 68.220.241.50 port 45036 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:31:11.823736 sshd-session[1875]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:31:11.833680 systemd-logind[1652]: New session 6 of user core. Jan 14 01:31:11.847371 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 14 01:31:12.114129 sshd[1879]: Connection closed by 68.220.241.50 port 45036 Jan 14 01:31:12.114651 sshd-session[1875]: pam_unix(sshd:session): session closed for user core Jan 14 01:31:12.121543 systemd-logind[1652]: Session 6 logged out. Waiting for processes to exit. Jan 14 01:31:12.121975 systemd[1]: sshd@4-10.0.22.183:22-68.220.241.50:45036.service: Deactivated successfully. Jan 14 01:31:12.124749 systemd[1]: session-6.scope: Deactivated successfully. Jan 14 01:31:12.127170 systemd-logind[1652]: Removed session 6. Jan 14 01:31:12.235599 systemd[1]: Started sshd@5-10.0.22.183:22-68.220.241.50:45038.service - OpenSSH per-connection server daemon (68.220.241.50:45038). Jan 14 01:31:12.780955 sshd[1885]: Accepted publickey for core from 68.220.241.50 port 45038 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:31:12.782516 sshd-session[1885]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:31:12.788867 systemd-logind[1652]: New session 7 of user core. Jan 14 01:31:12.795193 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 14 01:31:13.088111 sshd[1889]: Connection closed by 68.220.241.50 port 45038 Jan 14 01:31:13.086727 sshd-session[1885]: pam_unix(sshd:session): session closed for user core Jan 14 01:31:13.094837 systemd[1]: sshd@5-10.0.22.183:22-68.220.241.50:45038.service: Deactivated successfully. 
Jan 14 01:31:13.094898 systemd-logind[1652]: Session 7 logged out. Waiting for processes to exit. Jan 14 01:31:13.097067 systemd[1]: session-7.scope: Deactivated successfully. Jan 14 01:31:13.099796 systemd-logind[1652]: Removed session 7. Jan 14 01:31:13.197327 systemd[1]: Started sshd@6-10.0.22.183:22-68.220.241.50:45046.service - OpenSSH per-connection server daemon (68.220.241.50:45046). Jan 14 01:31:13.764714 sshd[1895]: Accepted publickey for core from 68.220.241.50 port 45046 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:31:13.765732 sshd-session[1895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:31:13.778257 systemd-logind[1652]: New session 8 of user core. Jan 14 01:31:13.788406 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 14 01:31:13.992782 sudo[1900]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 14 01:31:13.993059 sudo[1900]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:31:14.004848 sudo[1900]: pam_unix(sudo:session): session closed for user root Jan 14 01:31:14.101582 sshd[1899]: Connection closed by 68.220.241.50 port 45046 Jan 14 01:31:14.103022 sshd-session[1895]: pam_unix(sshd:session): session closed for user core Jan 14 01:31:14.113094 systemd[1]: sshd@6-10.0.22.183:22-68.220.241.50:45046.service: Deactivated successfully. Jan 14 01:31:14.113113 systemd-logind[1652]: Session 8 logged out. Waiting for processes to exit. Jan 14 01:31:14.115535 systemd[1]: session-8.scope: Deactivated successfully. Jan 14 01:31:14.119871 systemd-logind[1652]: Removed session 8. Jan 14 01:31:14.211465 systemd[1]: Started sshd@7-10.0.22.183:22-68.220.241.50:45052.service - OpenSSH per-connection server daemon (68.220.241.50:45052). Jan 14 01:31:14.373062 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Jan 14 01:31:14.375628 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 01:31:14.468960 chronyd[1637]: Selected source PHC0
Jan 14 01:31:14.469010 chronyd[1637]: System clock wrong by 1.388858 seconds
Jan 14 01:31:15.857894 chronyd[1637]: System clock was stepped by 1.388858 seconds
Jan 14 01:31:15.858366 systemd-resolved[1343]: Clock change detected. Flushing caches.
Jan 14 01:31:16.064135 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 01:31:16.084695 (kubelet)[1918]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 01:31:16.137700 kubelet[1918]: E0114 01:31:16.137650 1918 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 01:31:16.139641 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 01:31:16.139769 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 01:31:16.140312 systemd[1]: kubelet.service: Consumed 188ms CPU time, 107.4M memory peak.
Jan 14 01:31:16.160911 sshd[1907]: Accepted publickey for core from 68.220.241.50 port 45052 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:31:16.162387 sshd-session[1907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:31:16.167029 systemd-logind[1652]: New session 9 of user core.
Jan 14 01:31:16.178043 systemd[1]: Started session-9.scope - Session 9 of User core.
Jan 14 01:31:16.373740 sudo[1927]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Jan 14 01:31:16.374575 sudo[1927]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 14 01:31:16.379817 sudo[1927]: pam_unix(sudo:session): session closed for user root
Jan 14 01:31:16.395017 sudo[1926]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Jan 14 01:31:16.395789 sudo[1926]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 14 01:31:16.418367 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 14 01:31:16.455000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1
Jan 14 01:31:16.458145 augenrules[1951]: No rules
Jan 14 01:31:16.458877 kernel: kauditd_printk_skb: 74 callbacks suppressed
Jan 14 01:31:16.458917 kernel: audit: type=1305 audit(1768354276.455:231): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1
Jan 14 01:31:16.455000 audit[1951]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc230ad3f0 a2=420 a3=0 items=0 ppid=1932 pid=1951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:16.460462 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 14 01:31:16.460799 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 14 01:31:16.463711 kernel: audit: type=1300 audit(1768354276.455:231): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc230ad3f0 a2=420 a3=0 items=0 ppid=1932 pid=1951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:16.463763 kernel: audit: type=1327 audit(1768354276.455:231): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Jan 14 01:31:16.463778 kernel: audit: type=1106 audit(1768354276.460:232): pid=1926 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.455000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Jan 14 01:31:16.460000 audit[1926]: USER_END pid=1926 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.462066 sudo[1926]: pam_unix(sudo:session): session closed for user root
Jan 14 01:31:16.460000 audit[1926]: CRED_DISP pid=1926 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.468908 kernel: audit: type=1104 audit(1768354276.460:233): pid=1926 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.469013 kernel: audit: type=1130 audit(1768354276.460:234): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.460000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.460000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.473722 kernel: audit: type=1131 audit(1768354276.460:235): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.559557 sshd[1925]: Connection closed by 68.220.241.50 port 45052
Jan 14 01:31:16.560471 sshd-session[1907]: pam_unix(sshd:session): session closed for user core
Jan 14 01:31:16.560000 audit[1907]: USER_END pid=1907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:31:16.570868 kernel: audit: type=1106 audit(1768354276.560:236): pid=1907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:31:16.560000 audit[1907]: CRED_DISP pid=1907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:31:16.573247 systemd[1]: sshd@7-10.0.22.183:22-68.220.241.50:45052.service: Deactivated successfully.
Jan 14 01:31:16.576925 kernel: audit: type=1104 audit(1768354276.560:237): pid=1907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:31:16.577139 kernel: audit: type=1131 audit(1768354276.571:238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.22.183:22-68.220.241.50:45052 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.571000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.22.183:22-68.220.241.50:45052 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.576328 systemd[1]: session-9.scope: Deactivated successfully.
Jan 14 01:31:16.578695 systemd-logind[1652]: Session 9 logged out. Waiting for processes to exit.
Jan 14 01:31:16.581121 systemd-logind[1652]: Removed session 9.
Jan 14 01:31:16.674000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.22.183:22-68.220.241.50:45064 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:16.676585 systemd[1]: Started sshd@8-10.0.22.183:22-68.220.241.50:45064.service - OpenSSH per-connection server daemon (68.220.241.50:45064).
Jan 14 01:31:17.265000 audit[1960]: USER_ACCT pid=1960 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:31:17.268013 sshd[1960]: Accepted publickey for core from 68.220.241.50 port 45064 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:31:17.266000 audit[1960]: CRED_ACQ pid=1960 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:31:17.266000 audit[1960]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffffa71b710 a2=3 a3=0 items=0 ppid=1 pid=1960 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:17.266000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:31:17.269416 sshd-session[1960]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:31:17.276927 systemd-logind[1652]: New session 10 of user core.
Jan 14 01:31:17.283058 systemd[1]: Started session-10.scope - Session 10 of User core.
Jan 14 01:31:17.285000 audit[1960]: USER_START pid=1960 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:31:17.287000 audit[1964]: CRED_ACQ pid=1964 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:31:17.485000 audit[1965]: USER_ACCT pid=1965 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:17.488315 sudo[1965]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Jan 14 01:31:17.486000 audit[1965]: CRED_REFR pid=1965 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:17.490339 sudo[1965]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 14 01:31:17.488000 audit[1965]: USER_START pid=1965 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 14 01:31:18.014922 systemd[1]: Starting docker.service - Docker Application Container Engine...
Jan 14 01:31:18.029240 (dockerd)[1986]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Jan 14 01:31:18.382323 dockerd[1986]: time="2026-01-14T01:31:18.381978253Z" level=info msg="Starting up"
Jan 14 01:31:18.383970 dockerd[1986]: time="2026-01-14T01:31:18.383862131Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Jan 14 01:31:18.394324 dockerd[1986]: time="2026-01-14T01:31:18.394277145Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Jan 14 01:31:18.420190 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3052286710-merged.mount: Deactivated successfully.
Jan 14 01:31:18.456769 dockerd[1986]: time="2026-01-14T01:31:18.456703867Z" level=info msg="Loading containers: start."
Jan 14 01:31:18.470878 kernel: Initializing XFRM netlink socket
Jan 14 01:31:18.529000 audit[2034]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.529000 audit[2034]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffe63e24620 a2=0 a3=0 items=0 ppid=1986 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.529000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552
Jan 14 01:31:18.531000 audit[2036]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2036 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.531000 audit[2036]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffc68e63a20 a2=0 a3=0 items=0 ppid=1986 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.531000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552
Jan 14 01:31:18.533000 audit[2038]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2038 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.533000 audit[2038]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd189c6730 a2=0 a3=0 items=0 ppid=1986 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.533000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244
Jan 14 01:31:18.535000 audit[2040]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2040 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.535000 audit[2040]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe7dc8a480 a2=0 a3=0 items=0 ppid=1986 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.535000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745
Jan 14 01:31:18.537000 audit[2042]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2042 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.537000 audit[2042]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe89e62000 a2=0 a3=0 items=0 ppid=1986 pid=2042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.537000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354
Jan 14 01:31:18.538000 audit[2044]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2044 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.538000 audit[2044]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffef78b8230 a2=0 a3=0 items=0 ppid=1986 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.538000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 14 01:31:18.540000 audit[2046]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.540000 audit[2046]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd0dcbe420 a2=0 a3=0 items=0 ppid=1986 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.540000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 14 01:31:18.542000 audit[2048]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2048 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.542000 audit[2048]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffe516ac7b0 a2=0 a3=0 items=0 ppid=1986 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.542000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552
Jan 14 01:31:18.577000 audit[2051]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2051 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.577000 audit[2051]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffdabb178e0 a2=0 a3=0 items=0 ppid=1986 pid=2051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.577000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38
Jan 14 01:31:18.579000 audit[2053]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.579000 audit[2053]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffebea3c370 a2=0 a3=0 items=0 ppid=1986 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.579000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244
Jan 14 01:31:18.581000 audit[2055]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2055 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.581000 audit[2055]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffd22434b90 a2=0 a3=0 items=0 ppid=1986 pid=2055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.581000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745
Jan 14 01:31:18.583000 audit[2057]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2057 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.583000 audit[2057]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffe6fb6ce30 a2=0 a3=0 items=0 ppid=1986 pid=2057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.583000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 14 01:31:18.585000 audit[2059]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2059 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.585000 audit[2059]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7fff67e87430 a2=0 a3=0 items=0 ppid=1986 pid=2059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.585000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354
Jan 14 01:31:18.622000 audit[2089]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2089 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.622000 audit[2089]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffe96076e00 a2=0 a3=0 items=0 ppid=1986 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.622000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552
Jan 14 01:31:18.624000 audit[2091]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2091 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.624000 audit[2091]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffd83f16b60 a2=0 a3=0 items=0 ppid=1986 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.624000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552
Jan 14 01:31:18.626000 audit[2093]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2093 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.626000 audit[2093]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe115fb7c0 a2=0 a3=0 items=0 ppid=1986 pid=2093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.626000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244
Jan 14 01:31:18.628000 audit[2095]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2095 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.628000 audit[2095]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd4a89e070 a2=0 a3=0 items=0 ppid=1986 pid=2095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.628000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745
Jan 14 01:31:18.630000 audit[2097]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.630000 audit[2097]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff2ffa6640 a2=0 a3=0 items=0 ppid=1986 pid=2097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.630000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354
Jan 14 01:31:18.631000 audit[2099]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.631000 audit[2099]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff489094c0 a2=0 a3=0 items=0 ppid=1986 pid=2099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.631000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 14 01:31:18.633000 audit[2101]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2101 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.633000 audit[2101]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd88c74a00 a2=0 a3=0 items=0 ppid=1986 pid=2101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.633000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 14 01:31:18.635000 audit[2103]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.635000 audit[2103]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffccdf14e10 a2=0 a3=0 items=0 ppid=1986 pid=2103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.635000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552
Jan 14 01:31:18.637000 audit[2105]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2105 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.637000 audit[2105]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffd06b9dbf0 a2=0 a3=0 items=0 ppid=1986 pid=2105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.637000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238
Jan 14 01:31:18.639000 audit[2107]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2107 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.639000 audit[2107]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd5370fa80 a2=0 a3=0 items=0 ppid=1986 pid=2107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.639000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244
Jan 14 01:31:18.641000 audit[2109]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.641000 audit[2109]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7fff46f668f0 a2=0 a3=0 items=0 ppid=1986 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.641000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745
Jan 14 01:31:18.643000 audit[2111]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2111 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.643000 audit[2111]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fff57b162f0 a2=0 a3=0 items=0 ppid=1986 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.643000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 14 01:31:18.645000 audit[2113]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2113 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.645000 audit[2113]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7fffbe9f6710 a2=0 a3=0 items=0 ppid=1986 pid=2113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.645000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354
Jan 14 01:31:18.649000 audit[2118]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2118 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.649000 audit[2118]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd3ed0d470 a2=0 a3=0 items=0 ppid=1986 pid=2118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.649000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552
Jan 14 01:31:18.651000 audit[2120]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2120 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.651000 audit[2120]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffee92eae70 a2=0 a3=0 items=0 ppid=1986 pid=2120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.651000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E
Jan 14 01:31:18.653000 audit[2122]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2122 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.653000 audit[2122]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffe89152280 a2=0 a3=0 items=0 ppid=1986 pid=2122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.653000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552
Jan 14 01:31:18.655000 audit[2124]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.655000 audit[2124]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe2b18e190 a2=0 a3=0 items=0 ppid=1986 pid=2124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.655000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552
Jan 14 01:31:18.657000 audit[2126]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.657000 audit[2126]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffe053be460 a2=0 a3=0 items=0 ppid=1986 pid=2126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.657000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E
Jan 14 01:31:18.659000 audit[2128]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:31:18.659000 audit[2128]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffcc6750c90 a2=0 a3=0 items=0 ppid=1986 pid=2128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.659000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552
Jan 14 01:31:18.685000 audit[2133]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2133 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.685000 audit[2133]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffce26be370 a2=0 a3=0 items=0 ppid=1986 pid=2133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:31:18.685000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445
Jan 14 01:31:18.687000 audit[2135]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2135 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:31:18.687000 audit[2135]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffdf00d98f0 a2=0 a3=0 items=0 ppid=1986 pid=2135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi"
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:18.687000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 14 01:31:18.695000 audit[2143]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:18.695000 audit[2143]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7fff82b77d50 a2=0 a3=0 items=0 ppid=1986 pid=2143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:18.695000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 14 01:31:18.706000 audit[2149]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:18.706000 audit[2149]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fffd84b6250 a2=0 a3=0 items=0 ppid=1986 pid=2149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:18.706000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 14 01:31:18.708000 audit[2151]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:18.708000 audit[2151]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7fff85179cc0 a2=0 a3=0 items=0 ppid=1986 pid=2151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:18.708000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 14 01:31:18.710000 audit[2153]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:18.710000 audit[2153]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc3f95ea30 a2=0 a3=0 items=0 ppid=1986 pid=2153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:18.710000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 14 01:31:18.712000 audit[2155]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:18.712000 audit[2155]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffd48948620 a2=0 a3=0 items=0 ppid=1986 pid=2155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:18.712000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 01:31:18.714000 audit[2157]: NETFILTER_CFG table=filter:41 family=2 entries=1 
op=nft_register_rule pid=2157 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:18.714000 audit[2157]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd0ba822f0 a2=0 a3=0 items=0 ppid=1986 pid=2157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:18.714000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 14 01:31:18.717167 systemd-networkd[1567]: docker0: Link UP Jan 14 01:31:18.725577 dockerd[1986]: time="2026-01-14T01:31:18.725533906Z" level=info msg="Loading containers: done." Jan 14 01:31:18.760801 dockerd[1986]: time="2026-01-14T01:31:18.760732387Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 14 01:31:18.761016 dockerd[1986]: time="2026-01-14T01:31:18.760838742Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 14 01:31:18.761016 dockerd[1986]: time="2026-01-14T01:31:18.760937630Z" level=info msg="Initializing buildkit" Jan 14 01:31:18.787593 dockerd[1986]: time="2026-01-14T01:31:18.787555343Z" level=info msg="Completed buildkit initialization" Jan 14 01:31:18.794912 dockerd[1986]: time="2026-01-14T01:31:18.793870376Z" level=info msg="Daemon has completed initialization" Jan 14 01:31:18.794912 dockerd[1986]: time="2026-01-14T01:31:18.793921681Z" level=info msg="API listen on /run/docker.sock" Jan 14 01:31:18.793000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 01:31:18.794771 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 14 01:31:20.164687 containerd[1684]: time="2026-01-14T01:31:20.164613115Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 14 01:31:20.965678 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3004313022.mount: Deactivated successfully. Jan 14 01:31:21.841878 containerd[1684]: time="2026-01-14T01:31:21.841089450Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:21.842570 containerd[1684]: time="2026-01-14T01:31:21.842398158Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=27401903" Jan 14 01:31:21.843936 containerd[1684]: time="2026-01-14T01:31:21.843909394Z" level=info msg="ImageCreate event name:\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:21.846711 containerd[1684]: time="2026-01-14T01:31:21.846688315Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:21.847833 containerd[1684]: time="2026-01-14T01:31:21.847292591Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"29067246\" in 1.682626603s" Jan 14 01:31:21.847833 containerd[1684]: time="2026-01-14T01:31:21.847321735Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference 
\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\"" Jan 14 01:31:21.848051 containerd[1684]: time="2026-01-14T01:31:21.848007362Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 14 01:31:23.265020 containerd[1684]: time="2026-01-14T01:31:23.264934562Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:23.267464 containerd[1684]: time="2026-01-14T01:31:23.267428799Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=24985199" Jan 14 01:31:23.269789 containerd[1684]: time="2026-01-14T01:31:23.269737997Z" level=info msg="ImageCreate event name:\"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:23.273922 containerd[1684]: time="2026-01-14T01:31:23.273862833Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:23.275622 containerd[1684]: time="2026-01-14T01:31:23.274926069Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"26650388\" in 1.426826833s" Jan 14 01:31:23.275622 containerd[1684]: time="2026-01-14T01:31:23.274978571Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\"" Jan 14 01:31:23.275857 containerd[1684]: 
time="2026-01-14T01:31:23.275812067Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 14 01:31:24.508977 containerd[1684]: time="2026-01-14T01:31:24.508925739Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:24.510233 containerd[1684]: time="2026-01-14T01:31:24.510214970Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=19396939" Jan 14 01:31:24.512508 containerd[1684]: time="2026-01-14T01:31:24.512477090Z" level=info msg="ImageCreate event name:\"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:24.515180 containerd[1684]: time="2026-01-14T01:31:24.515142368Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:24.516012 containerd[1684]: time="2026-01-14T01:31:24.515766036Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"21062128\" in 1.239934564s" Jan 14 01:31:24.516012 containerd[1684]: time="2026-01-14T01:31:24.515800939Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\"" Jan 14 01:31:24.516191 containerd[1684]: time="2026-01-14T01:31:24.516181647Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 14 01:31:25.620316 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3534700502.mount: Deactivated successfully. Jan 14 01:31:25.976944 containerd[1684]: time="2026-01-14T01:31:25.976394210Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:25.978335 containerd[1684]: time="2026-01-14T01:31:25.978315968Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=31158177" Jan 14 01:31:25.980070 containerd[1684]: time="2026-01-14T01:31:25.980048070Z" level=info msg="ImageCreate event name:\"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:25.983137 containerd[1684]: time="2026-01-14T01:31:25.983095115Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:25.983961 containerd[1684]: time="2026-01-14T01:31:25.983529086Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"31160918\" in 1.46732668s" Jan 14 01:31:25.983961 containerd[1684]: time="2026-01-14T01:31:25.983558346Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\"" Jan 14 01:31:25.984197 containerd[1684]: time="2026-01-14T01:31:25.984180560Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 14 01:31:26.262218 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. 
Jan 14 01:31:26.266831 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:31:26.465866 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 14 01:31:26.465994 kernel: audit: type=1130 audit(1768354286.461:289): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:26.461000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:26.462322 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:31:26.480351 (kubelet)[2280]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:31:26.903268 kubelet[2280]: E0114 01:31:26.903205 2280 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:31:26.906749 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:31:26.907264 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:31:26.907000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:31:26.908157 systemd[1]: kubelet.service: Consumed 199ms CPU time, 110.3M memory peak. 
Jan 14 01:31:26.911909 kernel: audit: type=1131 audit(1768354286.907:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:31:27.230788 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3239159885.mount: Deactivated successfully. Jan 14 01:31:29.420386 containerd[1684]: time="2026-01-14T01:31:29.420318645Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:29.423066 containerd[1684]: time="2026-01-14T01:31:29.423021952Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18470356" Jan 14 01:31:29.424935 containerd[1684]: time="2026-01-14T01:31:29.424892148Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:29.441124 containerd[1684]: time="2026-01-14T01:31:29.441029143Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:29.441888 containerd[1684]: time="2026-01-14T01:31:29.441534018Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 3.457074768s" Jan 14 01:31:29.441888 containerd[1684]: time="2026-01-14T01:31:29.441577035Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference 
\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 14 01:31:29.442204 containerd[1684]: time="2026-01-14T01:31:29.442182469Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 14 01:31:30.082175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1022627681.mount: Deactivated successfully. Jan 14 01:31:30.095001 containerd[1684]: time="2026-01-14T01:31:30.094930139Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:31:30.098090 containerd[1684]: time="2026-01-14T01:31:30.098025874Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 14 01:31:30.100090 containerd[1684]: time="2026-01-14T01:31:30.099976165Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:31:30.104898 containerd[1684]: time="2026-01-14T01:31:30.104716744Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:31:30.107761 containerd[1684]: time="2026-01-14T01:31:30.106758409Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 664.539188ms" Jan 14 01:31:30.107761 containerd[1684]: time="2026-01-14T01:31:30.106827411Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image 
reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 14 01:31:30.108170 containerd[1684]: time="2026-01-14T01:31:30.108087531Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 14 01:31:30.747974 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4125158333.mount: Deactivated successfully. Jan 14 01:31:33.172700 containerd[1684]: time="2026-01-14T01:31:33.172621677Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:33.176571 containerd[1684]: time="2026-01-14T01:31:33.176199508Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=45502580" Jan 14 01:31:33.177986 containerd[1684]: time="2026-01-14T01:31:33.177946576Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:33.182614 containerd[1684]: time="2026-01-14T01:31:33.182578343Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:33.183950 containerd[1684]: time="2026-01-14T01:31:33.183917260Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.075775615s" Jan 14 01:31:33.184048 containerd[1684]: time="2026-01-14T01:31:33.184032589Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 14 01:31:37.011905 systemd[1]: kubelet.service: 
Scheduled restart job, restart counter is at 4. Jan 14 01:31:37.015190 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:31:37.141697 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:31:37.145897 kernel: audit: type=1130 audit(1768354297.141:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:37.141000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:37.149255 (kubelet)[2432]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:31:37.194582 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:31:37.364173 update_engine[1653]: I20260114 01:31:37.363890 1653 update_attempter.cc:509] Updating boot flags... Jan 14 01:31:37.638315 systemd[1]: kubelet.service: Deactivated successfully. Jan 14 01:31:37.638813 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:31:37.638000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:37.639946 systemd[1]: kubelet.service: Consumed 128ms CPU time, 108M memory peak. Jan 14 01:31:37.642869 kernel: audit: type=1131 audit(1768354297.638:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:37.643721 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 14 01:31:37.697744 systemd[1]: Reload requested from client PID 2460 ('systemctl') (unit session-10.scope)... Jan 14 01:31:37.697761 systemd[1]: Reloading... Jan 14 01:31:37.823888 zram_generator::config[2520]: No configuration found. Jan 14 01:31:38.082534 systemd[1]: Reloading finished in 384 ms. Jan 14 01:31:38.124793 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 14 01:31:38.124886 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 14 01:31:38.125161 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:31:38.124000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:31:38.128910 kernel: audit: type=1130 audit(1768354298.124:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:31:38.132275 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 14 01:31:38.133000 audit: BPF prog-id=63 op=LOAD Jan 14 01:31:38.136939 kernel: audit: type=1334 audit(1768354298.133:294): prog-id=63 op=LOAD Jan 14 01:31:38.136988 kernel: audit: type=1334 audit(1768354298.133:295): prog-id=50 op=UNLOAD Jan 14 01:31:38.133000 audit: BPF prog-id=50 op=UNLOAD Jan 14 01:31:38.133000 audit: BPF prog-id=64 op=LOAD Jan 14 01:31:38.140857 kernel: audit: type=1334 audit(1768354298.133:296): prog-id=64 op=LOAD Jan 14 01:31:38.133000 audit: BPF prog-id=65 op=LOAD Jan 14 01:31:38.133000 audit: BPF prog-id=51 op=UNLOAD Jan 14 01:31:38.144465 kernel: audit: type=1334 audit(1768354298.133:297): prog-id=65 op=LOAD Jan 14 01:31:38.144528 kernel: audit: type=1334 audit(1768354298.133:298): prog-id=51 op=UNLOAD Jan 14 01:31:38.144549 kernel: audit: type=1334 audit(1768354298.133:299): prog-id=52 op=UNLOAD Jan 14 01:31:38.133000 audit: BPF prog-id=52 op=UNLOAD Jan 14 01:31:38.133000 audit: BPF prog-id=66 op=LOAD Jan 14 01:31:38.146238 kernel: audit: type=1334 audit(1768354298.133:300): prog-id=66 op=LOAD Jan 14 01:31:38.133000 audit: BPF prog-id=46 op=UNLOAD Jan 14 01:31:38.137000 audit: BPF prog-id=67 op=LOAD Jan 14 01:31:38.137000 audit: BPF prog-id=47 op=UNLOAD Jan 14 01:31:38.137000 audit: BPF prog-id=68 op=LOAD Jan 14 01:31:38.137000 audit: BPF prog-id=69 op=LOAD Jan 14 01:31:38.137000 audit: BPF prog-id=48 op=UNLOAD Jan 14 01:31:38.137000 audit: BPF prog-id=49 op=UNLOAD Jan 14 01:31:38.140000 audit: BPF prog-id=70 op=LOAD Jan 14 01:31:38.140000 audit: BPF prog-id=53 op=UNLOAD Jan 14 01:31:38.140000 audit: BPF prog-id=71 op=LOAD Jan 14 01:31:38.140000 audit: BPF prog-id=72 op=LOAD Jan 14 01:31:38.140000 audit: BPF prog-id=54 op=UNLOAD Jan 14 01:31:38.140000 audit: BPF prog-id=55 op=UNLOAD Jan 14 01:31:38.140000 audit: BPF prog-id=73 op=LOAD Jan 14 01:31:38.141000 audit: BPF prog-id=74 op=LOAD Jan 14 01:31:38.141000 audit: BPF prog-id=44 op=UNLOAD Jan 14 01:31:38.141000 audit: BPF prog-id=45 op=UNLOAD Jan 14 01:31:38.147000 audit: BPF 
prog-id=75 op=LOAD Jan 14 01:31:38.147000 audit: BPF prog-id=43 op=UNLOAD Jan 14 01:31:38.149000 audit: BPF prog-id=76 op=LOAD Jan 14 01:31:38.149000 audit: BPF prog-id=59 op=UNLOAD Jan 14 01:31:38.150000 audit: BPF prog-id=77 op=LOAD Jan 14 01:31:38.150000 audit: BPF prog-id=60 op=UNLOAD Jan 14 01:31:38.150000 audit: BPF prog-id=78 op=LOAD Jan 14 01:31:38.150000 audit: BPF prog-id=79 op=LOAD Jan 14 01:31:38.151000 audit: BPF prog-id=61 op=UNLOAD Jan 14 01:31:38.151000 audit: BPF prog-id=62 op=UNLOAD Jan 14 01:31:38.151000 audit: BPF prog-id=80 op=LOAD Jan 14 01:31:38.151000 audit: BPF prog-id=56 op=UNLOAD Jan 14 01:31:38.151000 audit: BPF prog-id=81 op=LOAD Jan 14 01:31:38.151000 audit: BPF prog-id=82 op=LOAD Jan 14 01:31:38.151000 audit: BPF prog-id=57 op=UNLOAD Jan 14 01:31:38.151000 audit: BPF prog-id=58 op=UNLOAD Jan 14 01:31:38.879597 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:31:38.879000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:38.889106 (kubelet)[2561]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 01:31:38.926082 kubelet[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:31:38.926082 kubelet[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 14 01:31:38.926082 kubelet[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:31:38.926428 kubelet[2561]: I0114 01:31:38.926126 2561 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 01:31:39.226935 kubelet[2561]: I0114 01:31:39.224382 2561 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 14 01:31:39.226935 kubelet[2561]: I0114 01:31:39.224892 2561 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 01:31:39.226935 kubelet[2561]: I0114 01:31:39.225412 2561 server.go:954] "Client rotation is on, will bootstrap in background" Jan 14 01:31:40.173173 kubelet[2561]: E0114 01:31:40.173091 2561 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.22.183:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:40.175291 kubelet[2561]: I0114 01:31:40.175067 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:31:40.213407 kubelet[2561]: I0114 01:31:40.213150 2561 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 01:31:40.221816 kubelet[2561]: I0114 01:31:40.221193 2561 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 14 01:31:40.246494 kubelet[2561]: I0114 01:31:40.245978 2561 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 01:31:40.246494 kubelet[2561]: I0114 01:31:40.246076 2561 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4578-0-0-p-557efd55ff","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 01:31:40.246494 kubelet[2561]: I0114 01:31:40.246374 2561 topology_manager.go:138] "Creating topology manager 
with none policy" Jan 14 01:31:40.246494 kubelet[2561]: I0114 01:31:40.246392 2561 container_manager_linux.go:304] "Creating device plugin manager" Jan 14 01:31:40.246888 kubelet[2561]: I0114 01:31:40.246603 2561 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:31:40.446222 kubelet[2561]: I0114 01:31:40.445266 2561 kubelet.go:446] "Attempting to sync node with API server" Jan 14 01:31:40.446222 kubelet[2561]: I0114 01:31:40.445363 2561 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 01:31:40.446222 kubelet[2561]: I0114 01:31:40.445425 2561 kubelet.go:352] "Adding apiserver pod source" Jan 14 01:31:40.446222 kubelet[2561]: I0114 01:31:40.445450 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 01:31:40.485568 kubelet[2561]: W0114 01:31:40.485513 2561 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.22.183:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4578-0-0-p-557efd55ff&limit=500&resourceVersion=0": dial tcp 10.0.22.183:6443: connect: connection refused Jan 14 01:31:40.485742 kubelet[2561]: E0114 01:31:40.485618 2561 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.22.183:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4578-0-0-p-557efd55ff&limit=500&resourceVersion=0\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:40.485858 kubelet[2561]: I0114 01:31:40.485810 2561 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 01:31:40.486816 kubelet[2561]: I0114 01:31:40.486774 2561 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 14 01:31:40.486969 kubelet[2561]: W0114 01:31:40.486942 2561 probe.go:272] Flexvolume plugin directory at 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 14 01:31:40.492928 kubelet[2561]: I0114 01:31:40.492454 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 14 01:31:40.492928 kubelet[2561]: I0114 01:31:40.492519 2561 server.go:1287] "Started kubelet" Jan 14 01:31:40.497316 kubelet[2561]: W0114 01:31:40.497247 2561 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.22.183:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.22.183:6443: connect: connection refused Jan 14 01:31:40.497316 kubelet[2561]: E0114 01:31:40.497308 2561 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.22.183:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:40.497550 kubelet[2561]: I0114 01:31:40.497348 2561 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 01:31:40.498277 kubelet[2561]: I0114 01:31:40.498228 2561 server.go:479] "Adding debug handlers to kubelet server" Jan 14 01:31:40.500436 kubelet[2561]: I0114 01:31:40.500390 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 01:31:40.525423 kubelet[2561]: I0114 01:31:40.523472 2561 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 01:31:40.526678 kubelet[2561]: I0114 01:31:40.526598 2561 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 14 01:31:40.526868 kubelet[2561]: E0114 01:31:40.526839 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:40.528105 kubelet[2561]: I0114 01:31:40.527398 2561 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 14 01:31:40.528105 kubelet[2561]: I0114 01:31:40.527445 2561 reconciler.go:26] "Reconciler: start to sync state" Jan 14 01:31:40.529000 audit[2572]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2572 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:40.529000 audit[2572]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffe21d87b70 a2=0 a3=0 items=0 ppid=2561 pid=2572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.529000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 01:31:40.530000 audit[2573]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2573 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:40.530000 audit[2573]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcc0b8f130 a2=0 a3=0 items=0 ppid=2561 pid=2573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.530000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 01:31:40.532099 kubelet[2561]: I0114 01:31:40.531343 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 01:31:40.532099 kubelet[2561]: I0114 01:31:40.531631 2561 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 01:31:40.532000 audit[2575]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2575 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:40.532000 audit[2575]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fffb5318860 a2=0 a3=0 items=0 ppid=2561 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.532000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:31:40.534000 audit[2577]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2577 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:40.534000 audit[2577]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffee77be4c0 a2=0 a3=0 items=0 ppid=2561 pid=2577 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.534000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:31:40.543000 audit[2580]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2580 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:40.543000 audit[2580]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffe7fcd87f0 a2=0 a3=0 items=0 ppid=2561 pid=2580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.545724 kubelet[2561]: E0114 01:31:40.533978 2561 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.22.183:6443/api/v1/namespaces/default/events\": dial tcp 10.0.22.183:6443: connect: connection 
refused" event="&Event{ObjectMeta:{ci-4578-0-0-p-557efd55ff.188a74ddfde38c8b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4578-0-0-p-557efd55ff,UID:ci-4578-0-0-p-557efd55ff,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4578-0-0-p-557efd55ff,},FirstTimestamp:2026-01-14 01:31:40.492483723 +0000 UTC m=+1.599598872,LastTimestamp:2026-01-14 01:31:40.492483723 +0000 UTC m=+1.599598872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4578-0-0-p-557efd55ff,}" Jan 14 01:31:40.543000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 14 01:31:40.545946 kubelet[2561]: I0114 01:31:40.545905 2561 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 14 01:31:40.546000 audit[2581]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2581 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:40.546000 audit[2581]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd47d9d450 a2=0 a3=0 items=0 ppid=2561 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.546000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 01:31:40.547755 kubelet[2561]: I0114 01:31:40.547723 2561 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 14 01:31:40.547755 kubelet[2561]: I0114 01:31:40.547748 2561 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 14 01:31:40.547831 kubelet[2561]: I0114 01:31:40.547768 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 14 01:31:40.547831 kubelet[2561]: I0114 01:31:40.547776 2561 kubelet.go:2382] "Starting kubelet main sync loop" Jan 14 01:31:40.547831 kubelet[2561]: E0114 01:31:40.547816 2561 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 01:31:40.548062 kubelet[2561]: W0114 01:31:40.548016 2561 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.22.183:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.22.183:6443: connect: connection refused Jan 14 01:31:40.548096 kubelet[2561]: E0114 01:31:40.548068 2561 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.22.183:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:40.548140 kubelet[2561]: E0114 01:31:40.548122 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.22.183:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-557efd55ff?timeout=10s\": dial tcp 10.0.22.183:6443: connect: connection refused" interval="200ms" Jan 14 01:31:40.548330 kubelet[2561]: I0114 01:31:40.548299 2561 factory.go:221] Registration of the systemd container factory successfully Jan 14 01:31:40.548530 kubelet[2561]: I0114 01:31:40.548366 2561 factory.go:219] Registration of the crio container factory failed: Get 
"http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 01:31:40.549000 audit[2582]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2582 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:40.549000 audit[2582]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffc2779f00 a2=0 a3=0 items=0 ppid=2561 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.549000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 01:31:40.550000 audit[2583]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2583 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:40.550000 audit[2583]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff4668e9c0 a2=0 a3=0 items=0 ppid=2561 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.550000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 01:31:40.551000 audit[2584]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2584 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:40.551000 audit[2584]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe1eba3a70 a2=0 a3=0 items=0 ppid=2561 pid=2584 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.551000 
audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 01:31:40.552000 audit[2585]: NETFILTER_CFG table=filter:51 family=10 entries=1 op=nft_register_chain pid=2585 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:40.552000 audit[2585]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc1dbbe4f0 a2=0 a3=0 items=0 ppid=2561 pid=2585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.552000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 01:31:40.554988 kubelet[2561]: W0114 01:31:40.554936 2561 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.22.183:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.22.183:6443: connect: connection refused Jan 14 01:31:40.555109 kubelet[2561]: E0114 01:31:40.554991 2561 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.22.183:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:40.556117 kubelet[2561]: I0114 01:31:40.555731 2561 factory.go:221] Registration of the containerd container factory successfully Jan 14 01:31:40.555000 audit[2586]: NETFILTER_CFG table=nat:52 family=2 entries=1 op=nft_register_chain pid=2586 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:40.555000 audit[2586]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc887f05f0 a2=0 a3=0 items=0 ppid=2561 pid=2586 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.555000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 01:31:40.560000 audit[2587]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2587 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:40.560000 audit[2587]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc7ce9d9e0 a2=0 a3=0 items=0 ppid=2561 pid=2587 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:40.560000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 01:31:40.563827 kubelet[2561]: E0114 01:31:40.563782 2561 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 14 01:31:40.571874 kubelet[2561]: I0114 01:31:40.571830 2561 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 01:31:40.571874 kubelet[2561]: I0114 01:31:40.571855 2561 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 01:31:40.571874 kubelet[2561]: I0114 01:31:40.571873 2561 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:31:40.627030 kubelet[2561]: E0114 01:31:40.626969 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:40.707006 kubelet[2561]: E0114 01:31:40.648288 2561 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 14 01:31:40.707006 kubelet[2561]: I0114 01:31:40.706124 2561 policy_none.go:49] "None policy: Start" Jan 14 01:31:40.707006 kubelet[2561]: I0114 01:31:40.706156 2561 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 14 01:31:40.707006 kubelet[2561]: I0114 01:31:40.706171 2561 state_mem.go:35] "Initializing new in-memory state store" Jan 14 01:31:40.727964 kubelet[2561]: E0114 01:31:40.727883 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:40.749306 kubelet[2561]: E0114 01:31:40.749248 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.22.183:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-557efd55ff?timeout=10s\": dial tcp 10.0.22.183:6443: connect: connection refused" interval="400ms" Jan 14 01:31:40.828297 kubelet[2561]: E0114 01:31:40.828210 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:40.848676 kubelet[2561]: E0114 01:31:40.848623 2561 kubelet.go:2406] "Skipping pod 
synchronization" err="container runtime status check may not have completed yet" Jan 14 01:31:40.929120 kubelet[2561]: E0114 01:31:40.929070 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.030109 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.130810 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.150502 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.22.183:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-557efd55ff?timeout=10s\": dial tcp 10.0.22.183:6443: connect: connection refused" interval="800ms" Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.231093 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.249366 2561 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.331992 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.656481 kubelet[2561]: W0114 01:31:41.343680 2561 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.22.183:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4578-0-0-p-557efd55ff&limit=500&resourceVersion=0": dial tcp 10.0.22.183:6443: connect: connection refused Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.343765 2561 reflector.go:166] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.22.183:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4578-0-0-p-557efd55ff&limit=500&resourceVersion=0\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.432617 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.656481 kubelet[2561]: E0114 01:31:41.533107 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.657457 kubelet[2561]: E0114 01:31:41.633635 2561 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.657457 kubelet[2561]: W0114 01:31:41.651832 2561 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.22.183:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.22.183:6443: connect: connection refused Jan 14 01:31:41.657457 kubelet[2561]: E0114 01:31:41.651996 2561 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.22.183:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:41.668407 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 14 01:31:41.681619 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 14 01:31:41.689536 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 14 01:31:41.690439 kubelet[2561]: W0114 01:31:41.690352 2561 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.22.183:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.22.183:6443: connect: connection refused Jan 14 01:31:41.690439 kubelet[2561]: E0114 01:31:41.690419 2561 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.22.183:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:41.698679 kubelet[2561]: I0114 01:31:41.698452 2561 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 14 01:31:41.698679 kubelet[2561]: I0114 01:31:41.698645 2561 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 01:31:41.698943 kubelet[2561]: I0114 01:31:41.698656 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 01:31:41.699628 kubelet[2561]: I0114 01:31:41.699617 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 01:31:41.701097 kubelet[2561]: E0114 01:31:41.701082 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 14 01:31:41.701206 kubelet[2561]: E0114 01:31:41.701180 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:41.801962 kubelet[2561]: I0114 01:31:41.801908 2561 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:41.802591 kubelet[2561]: E0114 01:31:41.802569 2561 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.22.183:6443/api/v1/nodes\": dial tcp 10.0.22.183:6443: connect: connection refused" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:41.849547 kubelet[2561]: W0114 01:31:41.849474 2561 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.22.183:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.22.183:6443: connect: connection refused Jan 14 01:31:41.849547 kubelet[2561]: E0114 01:31:41.849521 2561 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.22.183:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:41.952349 kubelet[2561]: E0114 01:31:41.952169 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.22.183:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-557efd55ff?timeout=10s\": dial tcp 10.0.22.183:6443: connect: connection refused" interval="1.6s" Jan 14 01:31:42.006368 kubelet[2561]: I0114 01:31:42.006307 2561 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.007027 kubelet[2561]: E0114 01:31:42.006949 2561 kubelet_node_status.go:107] "Unable to register 
node with API server" err="Post \"https://10.0.22.183:6443/api/v1/nodes\": dial tcp 10.0.22.183:6443: connect: connection refused" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.064872 systemd[1]: Created slice kubepods-burstable-pod27e73373a1da7666e92e1356b1b10768.slice - libcontainer container kubepods-burstable-pod27e73373a1da7666e92e1356b1b10768.slice. Jan 14 01:31:42.076328 kubelet[2561]: E0114 01:31:42.076055 2561 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-557efd55ff\" not found" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.081159 systemd[1]: Created slice kubepods-burstable-pod141622e70e2aacc762987119dd29e1d8.slice - libcontainer container kubepods-burstable-pod141622e70e2aacc762987119dd29e1d8.slice. Jan 14 01:31:42.085360 kubelet[2561]: E0114 01:31:42.085320 2561 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-557efd55ff\" not found" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.090302 systemd[1]: Created slice kubepods-burstable-pod60950ca34439cde2704d1e09689cb00f.slice - libcontainer container kubepods-burstable-pod60950ca34439cde2704d1e09689cb00f.slice. 
Jan 14 01:31:42.093622 kubelet[2561]: E0114 01:31:42.093346 2561 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-557efd55ff\" not found" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137050 kubelet[2561]: I0114 01:31:42.136978 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/27e73373a1da7666e92e1356b1b10768-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4578-0-0-p-557efd55ff\" (UID: \"27e73373a1da7666e92e1356b1b10768\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137281 kubelet[2561]: I0114 01:31:42.137064 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-k8s-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137281 kubelet[2561]: I0114 01:31:42.137108 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60950ca34439cde2704d1e09689cb00f-kubeconfig\") pod \"kube-scheduler-ci-4578-0-0-p-557efd55ff\" (UID: \"60950ca34439cde2704d1e09689cb00f\") " pod="kube-system/kube-scheduler-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137281 kubelet[2561]: I0114 01:31:42.137150 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/27e73373a1da7666e92e1356b1b10768-ca-certs\") pod \"kube-apiserver-ci-4578-0-0-p-557efd55ff\" (UID: \"27e73373a1da7666e92e1356b1b10768\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137281 kubelet[2561]: I0114 01:31:42.137187 
2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/27e73373a1da7666e92e1356b1b10768-k8s-certs\") pod \"kube-apiserver-ci-4578-0-0-p-557efd55ff\" (UID: \"27e73373a1da7666e92e1356b1b10768\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137281 kubelet[2561]: I0114 01:31:42.137222 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-ca-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137517 kubelet[2561]: I0114 01:31:42.137264 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-flexvolume-dir\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137517 kubelet[2561]: I0114 01:31:42.137300 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-kubeconfig\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.137517 kubelet[2561]: I0114 01:31:42.137338 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" 
(UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.317258 kubelet[2561]: E0114 01:31:42.317217 2561 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.22.183:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.22.183:6443: connect: connection refused" logger="UnhandledError" Jan 14 01:31:42.378866 containerd[1684]: time="2026-01-14T01:31:42.378726940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4578-0-0-p-557efd55ff,Uid:27e73373a1da7666e92e1356b1b10768,Namespace:kube-system,Attempt:0,}" Jan 14 01:31:42.387566 containerd[1684]: time="2026-01-14T01:31:42.387484421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4578-0-0-p-557efd55ff,Uid:141622e70e2aacc762987119dd29e1d8,Namespace:kube-system,Attempt:0,}" Jan 14 01:31:42.394690 containerd[1684]: time="2026-01-14T01:31:42.394615975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4578-0-0-p-557efd55ff,Uid:60950ca34439cde2704d1e09689cb00f,Namespace:kube-system,Attempt:0,}" Jan 14 01:31:42.412171 kubelet[2561]: I0114 01:31:42.412128 2561 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.413581 kubelet[2561]: E0114 01:31:42.413535 2561 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.22.183:6443/api/v1/nodes\": dial tcp 10.0.22.183:6443: connect: connection refused" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:42.451297 containerd[1684]: time="2026-01-14T01:31:42.450156888Z" level=info msg="connecting to shim 339def40388bcd2d97bec53b8aa77531f8334cd6f6584d19028b88d75b0aa511" 
address="unix:///run/containerd/s/9f9ca5ad7ee154fc2689f2557acfe5d9d3fd3cba753dab0eaaee5faac9181e87" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:31:42.480080 containerd[1684]: time="2026-01-14T01:31:42.480015930Z" level=info msg="connecting to shim f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a" address="unix:///run/containerd/s/60b3286c09c1c34d2c4a71b1f2a8dc3d092805b02167a274ad4ba3c8e99b8fc2" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:31:42.491384 containerd[1684]: time="2026-01-14T01:31:42.491336796Z" level=info msg="connecting to shim 0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9" address="unix:///run/containerd/s/60f022b80d018d7878e94938b05bdb3616a524ee98254817c9adf6503f7bc1d8" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:31:42.516055 systemd[1]: Started cri-containerd-339def40388bcd2d97bec53b8aa77531f8334cd6f6584d19028b88d75b0aa511.scope - libcontainer container 339def40388bcd2d97bec53b8aa77531f8334cd6f6584d19028b88d75b0aa511. Jan 14 01:31:42.531134 systemd[1]: Started cri-containerd-0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9.scope - libcontainer container 0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9. 
Jan 14 01:31:42.538276 kernel: kauditd_printk_skb: 70 callbacks suppressed Jan 14 01:31:42.538375 kernel: audit: type=1334 audit(1768354302.535:347): prog-id=83 op=LOAD Jan 14 01:31:42.535000 audit: BPF prog-id=83 op=LOAD Jan 14 01:31:42.541000 audit: BPF prog-id=84 op=LOAD Jan 14 01:31:42.547283 kernel: audit: type=1334 audit(1768354302.541:348): prog-id=84 op=LOAD Jan 14 01:31:42.547365 kernel: audit: type=1300 audit(1768354302.541:348): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.541000 audit[2626]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.551525 kernel: audit: type=1327 audit(1768354302.541:348): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.541000 audit: BPF prog-id=84 op=UNLOAD Jan 14 01:31:42.558430 kernel: audit: type=1334 audit(1768354302.541:349): prog-id=84 op=UNLOAD Jan 14 01:31:42.558502 kernel: audit: type=1300 audit(1768354302.541:349): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 
ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.541000 audit[2626]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.561886 kernel: audit: type=1327 audit(1768354302.541:349): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.541000 audit: BPF prog-id=85 op=LOAD Jan 14 01:31:42.563186 kernel: audit: type=1334 audit(1768354302.541:350): prog-id=85 op=LOAD Jan 14 01:31:42.564194 kernel: audit: type=1300 audit(1768354302.541:350): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.541000 audit[2626]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:31:42.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.575998 kernel: audit: type=1327 audit(1768354302.541:350): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.541000 audit: BPF prog-id=86 op=LOAD Jan 14 01:31:42.541000 audit[2626]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.541000 audit: BPF prog-id=86 op=UNLOAD Jan 14 01:31:42.541000 audit[2626]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 
Jan 14 01:31:42.541000 audit: BPF prog-id=85 op=UNLOAD Jan 14 01:31:42.541000 audit[2626]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.541000 audit: BPF prog-id=87 op=LOAD Jan 14 01:31:42.541000 audit[2626]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2599 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333396465663430333838626364326439376265633533623861613737 Jan 14 01:31:42.552000 audit: BPF prog-id=88 op=LOAD Jan 14 01:31:42.552000 audit: BPF prog-id=89 op=LOAD Jan 14 01:31:42.552000 audit[2663]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2640 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.552000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034333465313465666161386334653637306335646463343461643365 Jan 14 01:31:42.552000 audit: BPF prog-id=89 op=UNLOAD Jan 14 01:31:42.552000 audit[2663]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.552000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034333465313465666161386334653637306335646463343461643365 Jan 14 01:31:42.553000 audit: BPF prog-id=90 op=LOAD Jan 14 01:31:42.553000 audit[2663]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2640 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034333465313465666161386334653637306335646463343461643365 Jan 14 01:31:42.553000 audit: BPF prog-id=91 op=LOAD Jan 14 01:31:42.553000 audit[2663]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2640 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 14 01:31:42.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034333465313465666161386334653637306335646463343461643365 Jan 14 01:31:42.553000 audit: BPF prog-id=91 op=UNLOAD Jan 14 01:31:42.553000 audit[2663]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034333465313465666161386334653637306335646463343461643365 Jan 14 01:31:42.553000 audit: BPF prog-id=90 op=UNLOAD Jan 14 01:31:42.553000 audit[2663]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034333465313465666161386334653637306335646463343461643365 Jan 14 01:31:42.553000 audit: BPF prog-id=92 op=LOAD Jan 14 01:31:42.553000 audit[2663]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2640 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034333465313465666161386334653637306335646463343461643365 Jan 14 01:31:42.581316 systemd[1]: Started cri-containerd-f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a.scope - libcontainer container f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a. Jan 14 01:31:42.601000 audit: BPF prog-id=93 op=LOAD Jan 14 01:31:42.607000 audit: BPF prog-id=94 op=LOAD Jan 14 01:31:42.607000 audit[2660]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2624 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.607000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635303036656165306131393136343332333930333833643566613633 Jan 14 01:31:42.608000 audit: BPF prog-id=94 op=UNLOAD Jan 14 01:31:42.608000 audit[2660]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635303036656165306131393136343332333930333833643566613633 Jan 14 
01:31:42.608000 audit: BPF prog-id=95 op=LOAD Jan 14 01:31:42.608000 audit[2660]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2624 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635303036656165306131393136343332333930333833643566613633 Jan 14 01:31:42.608000 audit: BPF prog-id=96 op=LOAD Jan 14 01:31:42.608000 audit[2660]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2624 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635303036656165306131393136343332333930333833643566613633 Jan 14 01:31:42.608000 audit: BPF prog-id=96 op=UNLOAD Jan 14 01:31:42.608000 audit[2660]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.608000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635303036656165306131393136343332333930333833643566613633 Jan 14 01:31:42.608000 audit: BPF prog-id=95 op=UNLOAD Jan 14 01:31:42.608000 audit[2660]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635303036656165306131393136343332333930333833643566613633 Jan 14 01:31:42.608000 audit: BPF prog-id=97 op=LOAD Jan 14 01:31:42.608000 audit[2660]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2624 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635303036656165306131393136343332333930333833643566613633 Jan 14 01:31:42.614152 containerd[1684]: time="2026-01-14T01:31:42.613940822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4578-0-0-p-557efd55ff,Uid:27e73373a1da7666e92e1356b1b10768,Namespace:kube-system,Attempt:0,} returns sandbox id \"339def40388bcd2d97bec53b8aa77531f8334cd6f6584d19028b88d75b0aa511\"" Jan 14 01:31:42.620733 
containerd[1684]: time="2026-01-14T01:31:42.620708774Z" level=info msg="CreateContainer within sandbox \"339def40388bcd2d97bec53b8aa77531f8334cd6f6584d19028b88d75b0aa511\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 14 01:31:42.635181 containerd[1684]: time="2026-01-14T01:31:42.635112602Z" level=info msg="Container c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:31:42.636557 containerd[1684]: time="2026-01-14T01:31:42.636512191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4578-0-0-p-557efd55ff,Uid:60950ca34439cde2704d1e09689cb00f,Namespace:kube-system,Attempt:0,} returns sandbox id \"0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9\"" Jan 14 01:31:42.640425 containerd[1684]: time="2026-01-14T01:31:42.640239700Z" level=info msg="CreateContainer within sandbox \"0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 14 01:31:42.649317 containerd[1684]: time="2026-01-14T01:31:42.649292174Z" level=info msg="CreateContainer within sandbox \"339def40388bcd2d97bec53b8aa77531f8334cd6f6584d19028b88d75b0aa511\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d\"" Jan 14 01:31:42.650174 containerd[1684]: time="2026-01-14T01:31:42.650140607Z" level=info msg="StartContainer for \"c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d\"" Jan 14 01:31:42.651262 containerd[1684]: time="2026-01-14T01:31:42.651237947Z" level=info msg="connecting to shim c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d" address="unix:///run/containerd/s/9f9ca5ad7ee154fc2689f2557acfe5d9d3fd3cba753dab0eaaee5faac9181e87" protocol=ttrpc version=3 Jan 14 01:31:42.657832 containerd[1684]: time="2026-01-14T01:31:42.657811961Z" level=info msg="Container 
b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:31:42.659990 containerd[1684]: time="2026-01-14T01:31:42.659962055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4578-0-0-p-557efd55ff,Uid:141622e70e2aacc762987119dd29e1d8,Namespace:kube-system,Attempt:0,} returns sandbox id \"f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a\"" Jan 14 01:31:42.662699 containerd[1684]: time="2026-01-14T01:31:42.662669514Z" level=info msg="CreateContainer within sandbox \"f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 14 01:31:42.676208 systemd[1]: Started cri-containerd-c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d.scope - libcontainer container c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d. Jan 14 01:31:42.678653 containerd[1684]: time="2026-01-14T01:31:42.678578975Z" level=info msg="CreateContainer within sandbox \"0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956\"" Jan 14 01:31:42.679502 containerd[1684]: time="2026-01-14T01:31:42.679479364Z" level=info msg="StartContainer for \"b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956\"" Jan 14 01:31:42.680793 containerd[1684]: time="2026-01-14T01:31:42.680667265Z" level=info msg="connecting to shim b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956" address="unix:///run/containerd/s/60f022b80d018d7878e94938b05bdb3616a524ee98254817c9adf6503f7bc1d8" protocol=ttrpc version=3 Jan 14 01:31:42.682293 containerd[1684]: time="2026-01-14T01:31:42.682270230Z" level=info msg="Container 958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893: CDI devices from CRI Config.CDIDevices: []" Jan 14 
01:31:42.691000 audit: BPF prog-id=98 op=LOAD Jan 14 01:31:42.693000 audit: BPF prog-id=99 op=LOAD Jan 14 01:31:42.693000 audit[2729]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=2599 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353633333364646666376435656665633932636464623131303339 Jan 14 01:31:42.693000 audit: BPF prog-id=99 op=UNLOAD Jan 14 01:31:42.693000 audit[2729]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353633333364646666376435656665633932636464623131303339 Jan 14 01:31:42.695000 audit: BPF prog-id=100 op=LOAD Jan 14 01:31:42.695000 audit[2729]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=2599 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.695000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353633333364646666376435656665633932636464623131303339 Jan 14 01:31:42.697004 containerd[1684]: time="2026-01-14T01:31:42.696977101Z" level=info msg="CreateContainer within sandbox \"f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893\"" Jan 14 01:31:42.696000 audit: BPF prog-id=101 op=LOAD Jan 14 01:31:42.696000 audit[2729]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=2599 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353633333364646666376435656665633932636464623131303339 Jan 14 01:31:42.696000 audit: BPF prog-id=101 op=UNLOAD Jan 14 01:31:42.696000 audit[2729]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353633333364646666376435656665633932636464623131303339 Jan 14 01:31:42.696000 
audit: BPF prog-id=100 op=UNLOAD Jan 14 01:31:42.696000 audit[2729]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353633333364646666376435656665633932636464623131303339 Jan 14 01:31:42.696000 audit: BPF prog-id=102 op=LOAD Jan 14 01:31:42.696000 audit[2729]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=2599 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353633333364646666376435656665633932636464623131303339 Jan 14 01:31:42.699605 containerd[1684]: time="2026-01-14T01:31:42.699461759Z" level=info msg="StartContainer for \"958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893\"" Jan 14 01:31:42.700981 containerd[1684]: time="2026-01-14T01:31:42.700929221Z" level=info msg="connecting to shim 958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893" address="unix:///run/containerd/s/60b3286c09c1c34d2c4a71b1f2a8dc3d092805b02167a274ad4ba3c8e99b8fc2" protocol=ttrpc version=3 Jan 14 01:31:42.702049 systemd[1]: Started cri-containerd-b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956.scope - libcontainer container 
b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956. Jan 14 01:31:42.729194 systemd[1]: Started cri-containerd-958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893.scope - libcontainer container 958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893. Jan 14 01:31:42.732000 audit: BPF prog-id=103 op=LOAD Jan 14 01:31:42.734000 audit: BPF prog-id=104 op=LOAD Jan 14 01:31:42.734000 audit[2748]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2640 pid=2748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.734000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313036323332376430663465616333333565323530356463636333 Jan 14 01:31:42.735000 audit: BPF prog-id=104 op=UNLOAD Jan 14 01:31:42.735000 audit[2748]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313036323332376430663465616333333565323530356463636333 Jan 14 01:31:42.735000 audit: BPF prog-id=105 op=LOAD Jan 14 01:31:42.735000 audit[2748]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2640 pid=2748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313036323332376430663465616333333565323530356463636333 Jan 14 01:31:42.735000 audit: BPF prog-id=106 op=LOAD Jan 14 01:31:42.735000 audit[2748]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2640 pid=2748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313036323332376430663465616333333565323530356463636333 Jan 14 01:31:42.735000 audit: BPF prog-id=106 op=UNLOAD Jan 14 01:31:42.735000 audit[2748]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313036323332376430663465616333333565323530356463636333 Jan 14 01:31:42.735000 audit: BPF prog-id=105 op=UNLOAD Jan 14 01:31:42.735000 audit[2748]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313036323332376430663465616333333565323530356463636333 Jan 14 01:31:42.735000 audit: BPF prog-id=107 op=LOAD Jan 14 01:31:42.735000 audit[2748]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2640 pid=2748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313036323332376430663465616333333565323530356463636333 Jan 14 01:31:42.749000 audit: BPF prog-id=108 op=LOAD Jan 14 01:31:42.752000 audit: BPF prog-id=109 op=LOAD Jan 14 01:31:42.752000 audit[2760]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2624 pid=2760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.752000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3935383135326532613962353831313366393236376161323631393964 Jan 14 01:31:42.752000 audit: BPF prog-id=109 op=UNLOAD Jan 14 01:31:42.752000 audit[2760]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 
a3=0 items=0 ppid=2624 pid=2760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.752000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3935383135326532613962353831313366393236376161323631393964 Jan 14 01:31:42.752000 audit: BPF prog-id=110 op=LOAD Jan 14 01:31:42.752000 audit[2760]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2624 pid=2760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.752000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3935383135326532613962353831313366393236376161323631393964 Jan 14 01:31:42.752000 audit: BPF prog-id=111 op=LOAD Jan 14 01:31:42.752000 audit[2760]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2624 pid=2760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.752000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3935383135326532613962353831313366393236376161323631393964 Jan 14 01:31:42.752000 audit: BPF prog-id=111 op=UNLOAD Jan 14 01:31:42.752000 audit[2760]: SYSCALL arch=c000003e 
syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=2760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.752000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3935383135326532613962353831313366393236376161323631393964 Jan 14 01:31:42.752000 audit: BPF prog-id=110 op=UNLOAD Jan 14 01:31:42.752000 audit[2760]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=2760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.752000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3935383135326532613962353831313366393236376161323631393964 Jan 14 01:31:42.752000 audit: BPF prog-id=112 op=LOAD Jan 14 01:31:42.752000 audit[2760]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2624 pid=2760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:42.752000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3935383135326532613962353831313366393236376161323631393964 Jan 14 01:31:42.756943 containerd[1684]: time="2026-01-14T01:31:42.756758425Z" 
level=info msg="StartContainer for \"c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d\" returns successfully" Jan 14 01:31:42.812019 containerd[1684]: time="2026-01-14T01:31:42.811760852Z" level=info msg="StartContainer for \"958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893\" returns successfully" Jan 14 01:31:42.828683 containerd[1684]: time="2026-01-14T01:31:42.828557215Z" level=info msg="StartContainer for \"b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956\" returns successfully" Jan 14 01:31:43.216177 kubelet[2561]: I0114 01:31:43.216025 2561 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:43.577138 kubelet[2561]: E0114 01:31:43.577110 2561 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-557efd55ff\" not found" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:43.579384 kubelet[2561]: E0114 01:31:43.579184 2561 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-557efd55ff\" not found" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:43.581251 kubelet[2561]: E0114 01:31:43.581236 2561 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-557efd55ff\" not found" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.066064 kubelet[2561]: E0114 01:31:44.066030 2561 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4578-0-0-p-557efd55ff\" not found" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.108866 kubelet[2561]: I0114 01:31:44.107819 2561 kubelet_node_status.go:78] "Successfully registered node" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.127866 kubelet[2561]: I0114 01:31:44.127264 2561 kubelet.go:3194] "Creating a mirror pod for static pod" 
pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.210861 kubelet[2561]: E0114 01:31:44.209103 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.210861 kubelet[2561]: I0114 01:31:44.209135 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.215866 kubelet[2561]: E0114 01:31:44.215838 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4578-0-0-p-557efd55ff\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.216152 kubelet[2561]: I0114 01:31:44.216010 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.221836 kubelet[2561]: E0114 01:31:44.221811 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4578-0-0-p-557efd55ff\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.500888 kubelet[2561]: I0114 01:31:44.500748 2561 apiserver.go:52] "Watching apiserver" Jan 14 01:31:44.528211 kubelet[2561]: I0114 01:31:44.528165 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 14 01:31:44.581211 kubelet[2561]: I0114 01:31:44.581186 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.581550 kubelet[2561]: I0114 01:31:44.581539 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.584437 kubelet[2561]: 
E0114 01:31:44.584416 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4578-0-0-p-557efd55ff\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:44.584640 kubelet[2561]: E0114 01:31:44.584629 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4578-0-0-p-557efd55ff\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.186941 systemd[1]: Reload requested from client PID 2827 ('systemctl') (unit session-10.scope)... Jan 14 01:31:46.187281 systemd[1]: Reloading... Jan 14 01:31:46.275865 zram_generator::config[2872]: No configuration found. Jan 14 01:31:46.477952 systemd[1]: Reloading finished in 290 ms. Jan 14 01:31:46.511165 kubelet[2561]: I0114 01:31:46.511070 2561 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:31:46.512332 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:31:46.519340 systemd[1]: kubelet.service: Deactivated successfully. Jan 14 01:31:46.519764 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:31:46.519000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:46.519947 systemd[1]: kubelet.service: Consumed 752ms CPU time, 132.3M memory peak. 
Jan 14 01:31:46.521000 audit: BPF prog-id=113 op=LOAD Jan 14 01:31:46.521000 audit: BPF prog-id=80 op=UNLOAD Jan 14 01:31:46.521000 audit: BPF prog-id=114 op=LOAD Jan 14 01:31:46.521000 audit: BPF prog-id=115 op=LOAD Jan 14 01:31:46.521000 audit: BPF prog-id=81 op=UNLOAD Jan 14 01:31:46.521000 audit: BPF prog-id=82 op=UNLOAD Jan 14 01:31:46.522209 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:31:46.522000 audit: BPF prog-id=116 op=LOAD Jan 14 01:31:46.523000 audit: BPF prog-id=76 op=UNLOAD Jan 14 01:31:46.523000 audit: BPF prog-id=117 op=LOAD Jan 14 01:31:46.523000 audit: BPF prog-id=75 op=UNLOAD Jan 14 01:31:46.524000 audit: BPF prog-id=118 op=LOAD Jan 14 01:31:46.524000 audit: BPF prog-id=67 op=UNLOAD Jan 14 01:31:46.524000 audit: BPF prog-id=119 op=LOAD Jan 14 01:31:46.524000 audit: BPF prog-id=120 op=LOAD Jan 14 01:31:46.524000 audit: BPF prog-id=68 op=UNLOAD Jan 14 01:31:46.524000 audit: BPF prog-id=69 op=UNLOAD Jan 14 01:31:46.526000 audit: BPF prog-id=121 op=LOAD Jan 14 01:31:46.531000 audit: BPF prog-id=77 op=UNLOAD Jan 14 01:31:46.531000 audit: BPF prog-id=122 op=LOAD Jan 14 01:31:46.531000 audit: BPF prog-id=123 op=LOAD Jan 14 01:31:46.531000 audit: BPF prog-id=78 op=UNLOAD Jan 14 01:31:46.531000 audit: BPF prog-id=79 op=UNLOAD Jan 14 01:31:46.531000 audit: BPF prog-id=124 op=LOAD Jan 14 01:31:46.531000 audit: BPF prog-id=125 op=LOAD Jan 14 01:31:46.531000 audit: BPF prog-id=73 op=UNLOAD Jan 14 01:31:46.531000 audit: BPF prog-id=74 op=UNLOAD Jan 14 01:31:46.532000 audit: BPF prog-id=126 op=LOAD Jan 14 01:31:46.533000 audit: BPF prog-id=66 op=UNLOAD Jan 14 01:31:46.533000 audit: BPF prog-id=127 op=LOAD Jan 14 01:31:46.533000 audit: BPF prog-id=70 op=UNLOAD Jan 14 01:31:46.533000 audit: BPF prog-id=128 op=LOAD Jan 14 01:31:46.533000 audit: BPF prog-id=129 op=LOAD Jan 14 01:31:46.533000 audit: BPF prog-id=71 op=UNLOAD Jan 14 01:31:46.533000 audit: BPF prog-id=72 op=UNLOAD Jan 14 01:31:46.535000 audit: BPF prog-id=130 
op=LOAD Jan 14 01:31:46.535000 audit: BPF prog-id=63 op=UNLOAD Jan 14 01:31:46.536000 audit: BPF prog-id=131 op=LOAD Jan 14 01:31:46.536000 audit: BPF prog-id=132 op=LOAD Jan 14 01:31:46.536000 audit: BPF prog-id=64 op=UNLOAD Jan 14 01:31:46.536000 audit: BPF prog-id=65 op=UNLOAD Jan 14 01:31:46.656293 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:31:46.656000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:31:46.663086 (kubelet)[2924]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 01:31:46.703866 kubelet[2924]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:31:46.703866 kubelet[2924]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 14 01:31:46.703866 kubelet[2924]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 14 01:31:46.703866 kubelet[2924]: I0114 01:31:46.703701 2924 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 01:31:46.710760 kubelet[2924]: I0114 01:31:46.710728 2924 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 14 01:31:46.710973 kubelet[2924]: I0114 01:31:46.710918 2924 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 01:31:46.711255 kubelet[2924]: I0114 01:31:46.711244 2924 server.go:954] "Client rotation is on, will bootstrap in background" Jan 14 01:31:46.712830 kubelet[2924]: I0114 01:31:46.712634 2924 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 14 01:31:46.715036 kubelet[2924]: I0114 01:31:46.715019 2924 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:31:46.719456 kubelet[2924]: I0114 01:31:46.719436 2924 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 01:31:46.722613 kubelet[2924]: I0114 01:31:46.722600 2924 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 14 01:31:46.723011 kubelet[2924]: I0114 01:31:46.722989 2924 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 01:31:46.723297 kubelet[2924]: I0114 01:31:46.723069 2924 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4578-0-0-p-557efd55ff","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 01:31:46.723502 kubelet[2924]: I0114 01:31:46.723373 2924 topology_manager.go:138] "Creating topology manager 
with none policy" Jan 14 01:31:46.723502 kubelet[2924]: I0114 01:31:46.723384 2924 container_manager_linux.go:304] "Creating device plugin manager" Jan 14 01:31:46.723502 kubelet[2924]: I0114 01:31:46.723429 2924 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:31:46.723733 kubelet[2924]: I0114 01:31:46.723725 2924 kubelet.go:446] "Attempting to sync node with API server" Jan 14 01:31:46.723805 kubelet[2924]: I0114 01:31:46.723799 2924 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 01:31:46.723965 kubelet[2924]: I0114 01:31:46.723907 2924 kubelet.go:352] "Adding apiserver pod source" Jan 14 01:31:46.723965 kubelet[2924]: I0114 01:31:46.723921 2924 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 01:31:46.733864 kubelet[2924]: I0114 01:31:46.732300 2924 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 01:31:46.733864 kubelet[2924]: I0114 01:31:46.732704 2924 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 14 01:31:46.733864 kubelet[2924]: I0114 01:31:46.733135 2924 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 14 01:31:46.733864 kubelet[2924]: I0114 01:31:46.733161 2924 server.go:1287] "Started kubelet" Jan 14 01:31:46.737502 kubelet[2924]: I0114 01:31:46.737459 2924 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 01:31:46.737926 kubelet[2924]: I0114 01:31:46.737721 2924 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 01:31:46.738606 kubelet[2924]: I0114 01:31:46.738593 2924 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 01:31:46.739044 kubelet[2924]: I0114 01:31:46.738878 2924 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 01:31:46.741424 kubelet[2924]: I0114 
01:31:46.740723 2924 server.go:479] "Adding debug handlers to kubelet server" Jan 14 01:31:46.742366 kubelet[2924]: I0114 01:31:46.742351 2924 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 01:31:46.743029 kubelet[2924]: I0114 01:31:46.743015 2924 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 14 01:31:46.743209 kubelet[2924]: E0114 01:31:46.743196 2924 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-557efd55ff\" not found" Jan 14 01:31:46.743657 kubelet[2924]: I0114 01:31:46.743382 2924 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 14 01:31:46.743657 kubelet[2924]: I0114 01:31:46.743489 2924 reconciler.go:26] "Reconciler: start to sync state" Jan 14 01:31:46.747879 kubelet[2924]: I0114 01:31:46.747862 2924 factory.go:221] Registration of the systemd container factory successfully Jan 14 01:31:46.747978 kubelet[2924]: I0114 01:31:46.747964 2924 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 01:31:46.749669 kubelet[2924]: I0114 01:31:46.749657 2924 factory.go:221] Registration of the containerd container factory successfully Jan 14 01:31:46.763959 kubelet[2924]: I0114 01:31:46.763927 2924 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 14 01:31:46.767583 kubelet[2924]: I0114 01:31:46.767536 2924 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 14 01:31:46.767583 kubelet[2924]: I0114 01:31:46.767566 2924 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 14 01:31:46.767583 kubelet[2924]: I0114 01:31:46.767585 2924 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 14 01:31:46.767727 kubelet[2924]: I0114 01:31:46.767602 2924 kubelet.go:2382] "Starting kubelet main sync loop" Jan 14 01:31:46.767727 kubelet[2924]: E0114 01:31:46.767641 2924 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 01:31:46.790449 kubelet[2924]: I0114 01:31:46.790411 2924 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 01:31:46.790449 kubelet[2924]: I0114 01:31:46.790429 2924 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 01:31:46.790449 kubelet[2924]: I0114 01:31:46.790450 2924 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:31:46.790607 kubelet[2924]: I0114 01:31:46.790596 2924 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 14 01:31:46.790651 kubelet[2924]: I0114 01:31:46.790609 2924 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 14 01:31:46.790651 kubelet[2924]: I0114 01:31:46.790626 2924 policy_none.go:49] "None policy: Start" Jan 14 01:31:46.790651 kubelet[2924]: I0114 01:31:46.790636 2924 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 14 01:31:46.790651 kubelet[2924]: I0114 01:31:46.790645 2924 state_mem.go:35] "Initializing new in-memory state store" Jan 14 01:31:46.790752 kubelet[2924]: I0114 01:31:46.790744 2924 state_mem.go:75] "Updated machine memory state" Jan 14 01:31:46.794062 kubelet[2924]: I0114 01:31:46.794028 2924 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 14 01:31:46.794195 kubelet[2924]: I0114 
01:31:46.794185 2924 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 01:31:46.794236 kubelet[2924]: I0114 01:31:46.794199 2924 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 01:31:46.794772 kubelet[2924]: I0114 01:31:46.794759 2924 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 01:31:46.796613 kubelet[2924]: E0114 01:31:46.795458 2924 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 14 01:31:46.868966 kubelet[2924]: I0114 01:31:46.868927 2924 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.870045 kubelet[2924]: I0114 01:31:46.869204 2924 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.870142 kubelet[2924]: I0114 01:31:46.869328 2924 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.898611 kubelet[2924]: I0114 01:31:46.898583 2924 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.906487 kubelet[2924]: I0114 01:31:46.906462 2924 kubelet_node_status.go:124] "Node was previously registered" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.906639 kubelet[2924]: I0114 01:31:46.906626 2924 kubelet_node_status.go:78] "Successfully registered node" node="ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944129 kubelet[2924]: I0114 01:31:46.944074 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60950ca34439cde2704d1e09689cb00f-kubeconfig\") pod \"kube-scheduler-ci-4578-0-0-p-557efd55ff\" (UID: \"60950ca34439cde2704d1e09689cb00f\") " 
pod="kube-system/kube-scheduler-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944465 kubelet[2924]: I0114 01:31:46.944337 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/27e73373a1da7666e92e1356b1b10768-k8s-certs\") pod \"kube-apiserver-ci-4578-0-0-p-557efd55ff\" (UID: \"27e73373a1da7666e92e1356b1b10768\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944465 kubelet[2924]: I0114 01:31:46.944367 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-ca-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944465 kubelet[2924]: I0114 01:31:46.944395 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-k8s-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944465 kubelet[2924]: I0114 01:31:46.944411 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-kubeconfig\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944465 kubelet[2924]: I0114 01:31:46.944427 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/27e73373a1da7666e92e1356b1b10768-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4578-0-0-p-557efd55ff\" (UID: \"27e73373a1da7666e92e1356b1b10768\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944580 kubelet[2924]: I0114 01:31:46.944442 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-flexvolume-dir\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944663 kubelet[2924]: I0114 01:31:46.944611 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/141622e70e2aacc762987119dd29e1d8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4578-0-0-p-557efd55ff\" (UID: \"141622e70e2aacc762987119dd29e1d8\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:46.944663 kubelet[2924]: I0114 01:31:46.944632 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/27e73373a1da7666e92e1356b1b10768-ca-certs\") pod \"kube-apiserver-ci-4578-0-0-p-557efd55ff\" (UID: \"27e73373a1da7666e92e1356b1b10768\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" Jan 14 01:31:47.734325 kubelet[2924]: I0114 01:31:47.734278 2924 apiserver.go:52] "Watching apiserver" Jan 14 01:31:47.743767 kubelet[2924]: I0114 01:31:47.743721 2924 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 14 01:31:47.856737 kubelet[2924]: I0114 01:31:47.855921 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-ci-4578-0-0-p-557efd55ff" podStartSLOduration=1.855903589 podStartE2EDuration="1.855903589s" podCreationTimestamp="2026-01-14 01:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:31:47.836360169 +0000 UTC m=+1.168589431" watchObservedRunningTime="2026-01-14 01:31:47.855903589 +0000 UTC m=+1.188132834" Jan 14 01:31:47.856737 kubelet[2924]: I0114 01:31:47.856053 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" podStartSLOduration=1.856048336 podStartE2EDuration="1.856048336s" podCreationTimestamp="2026-01-14 01:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:31:47.855000734 +0000 UTC m=+1.187229986" watchObservedRunningTime="2026-01-14 01:31:47.856048336 +0000 UTC m=+1.188277588" Jan 14 01:31:47.890724 kubelet[2924]: I0114 01:31:47.890669 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4578-0-0-p-557efd55ff" podStartSLOduration=1.890652489 podStartE2EDuration="1.890652489s" podCreationTimestamp="2026-01-14 01:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:31:47.877052232 +0000 UTC m=+1.209281480" watchObservedRunningTime="2026-01-14 01:31:47.890652489 +0000 UTC m=+1.222881745" Jan 14 01:31:52.103192 kubelet[2924]: I0114 01:31:52.103142 2924 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 14 01:31:52.103883 containerd[1684]: time="2026-01-14T01:31:52.103821016Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Jan 14 01:31:52.104208 kubelet[2924]: I0114 01:31:52.104190 2924 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 14 01:31:52.799363 systemd[1]: Created slice kubepods-besteffort-podbd5cc7fd_b254_4379_9040_ff9c77e0e6be.slice - libcontainer container kubepods-besteffort-podbd5cc7fd_b254_4379_9040_ff9c77e0e6be.slice. Jan 14 01:31:52.881934 kubelet[2924]: I0114 01:31:52.881862 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/bd5cc7fd-b254-4379-9040-ff9c77e0e6be-xtables-lock\") pod \"kube-proxy-8pvvl\" (UID: \"bd5cc7fd-b254-4379-9040-ff9c77e0e6be\") " pod="kube-system/kube-proxy-8pvvl" Jan 14 01:31:52.881934 kubelet[2924]: I0114 01:31:52.881902 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/bd5cc7fd-b254-4379-9040-ff9c77e0e6be-kube-proxy\") pod \"kube-proxy-8pvvl\" (UID: \"bd5cc7fd-b254-4379-9040-ff9c77e0e6be\") " pod="kube-system/kube-proxy-8pvvl" Jan 14 01:31:52.882184 kubelet[2924]: I0114 01:31:52.881964 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd5cc7fd-b254-4379-9040-ff9c77e0e6be-lib-modules\") pod \"kube-proxy-8pvvl\" (UID: \"bd5cc7fd-b254-4379-9040-ff9c77e0e6be\") " pod="kube-system/kube-proxy-8pvvl" Jan 14 01:31:52.882184 kubelet[2924]: I0114 01:31:52.882010 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqw9m\" (UniqueName: \"kubernetes.io/projected/bd5cc7fd-b254-4379-9040-ff9c77e0e6be-kube-api-access-lqw9m\") pod \"kube-proxy-8pvvl\" (UID: \"bd5cc7fd-b254-4379-9040-ff9c77e0e6be\") " pod="kube-system/kube-proxy-8pvvl" Jan 14 01:31:53.096796 systemd[1]: Created slice 
kubepods-besteffort-podd63518e5_af84_444e_9f27_a47ff2026ae3.slice - libcontainer container kubepods-besteffort-podd63518e5_af84_444e_9f27_a47ff2026ae3.slice. Jan 14 01:31:53.112259 containerd[1684]: time="2026-01-14T01:31:53.112230341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8pvvl,Uid:bd5cc7fd-b254-4379-9040-ff9c77e0e6be,Namespace:kube-system,Attempt:0,}" Jan 14 01:31:53.140094 containerd[1684]: time="2026-01-14T01:31:53.140011089Z" level=info msg="connecting to shim 1d63f9a299576723b3516f0bce422fec640509a4fc56c5254388ff09c55082fe" address="unix:///run/containerd/s/bce570d732bf4c8c73522f89b4e62f6ea4ef07cf9f50ff3f00c99c4adc96a412" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:31:53.168044 systemd[1]: Started cri-containerd-1d63f9a299576723b3516f0bce422fec640509a4fc56c5254388ff09c55082fe.scope - libcontainer container 1d63f9a299576723b3516f0bce422fec640509a4fc56c5254388ff09c55082fe. Jan 14 01:31:53.176000 audit: BPF prog-id=133 op=LOAD Jan 14 01:31:53.178110 kernel: kauditd_printk_skb: 164 callbacks suppressed Jan 14 01:31:53.178154 kernel: audit: type=1334 audit(1768354313.176:437): prog-id=133 op=LOAD Jan 14 01:31:53.179000 audit: BPF prog-id=134 op=LOAD Jan 14 01:31:53.179000 audit[2987]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.183063 kernel: audit: type=1334 audit(1768354313.179:438): prog-id=134 op=LOAD Jan 14 01:31:53.183105 kernel: audit: type=1300 audit(1768354313.179:438): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:31:53.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.185933 kubelet[2924]: I0114 01:31:53.185757 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d63518e5-af84-444e-9f27-a47ff2026ae3-var-lib-calico\") pod \"tigera-operator-7dcd859c48-gjw9x\" (UID: \"d63518e5-af84-444e-9f27-a47ff2026ae3\") " pod="tigera-operator/tigera-operator-7dcd859c48-gjw9x" Jan 14 01:31:53.185933 kubelet[2924]: I0114 01:31:53.185883 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4pzr\" (UniqueName: \"kubernetes.io/projected/d63518e5-af84-444e-9f27-a47ff2026ae3-kube-api-access-l4pzr\") pod \"tigera-operator-7dcd859c48-gjw9x\" (UID: \"d63518e5-af84-444e-9f27-a47ff2026ae3\") " pod="tigera-operator/tigera-operator-7dcd859c48-gjw9x" Jan 14 01:31:53.187179 kernel: audit: type=1327 audit(1768354313.179:438): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.179000 audit: BPF prog-id=134 op=UNLOAD Jan 14 01:31:53.190267 kernel: audit: type=1334 audit(1768354313.179:439): prog-id=134 op=UNLOAD Jan 14 01:31:53.179000 audit[2987]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.192477 kernel: audit: 
type=1300 audit(1768354313.179:439): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.179000 audit: BPF prog-id=135 op=LOAD Jan 14 01:31:53.200434 kernel: audit: type=1327 audit(1768354313.179:439): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.200485 kernel: audit: type=1334 audit(1768354313.179:440): prog-id=135 op=LOAD Jan 14 01:31:53.179000 audit[2987]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.202178 kernel: audit: type=1300 audit(1768354313.179:440): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.179000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.206185 kernel: audit: type=1327 audit(1768354313.179:440): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.179000 audit: BPF prog-id=136 op=LOAD Jan 14 01:31:53.179000 audit[2987]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.179000 audit: BPF prog-id=136 op=UNLOAD Jan 14 01:31:53.179000 audit[2987]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.179000 audit: 
BPF prog-id=135 op=UNLOAD Jan 14 01:31:53.179000 audit[2987]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.179000 audit: BPF prog-id=137 op=LOAD Jan 14 01:31:53.179000 audit[2987]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=2977 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164363366396132393935373637323362333531366630626365343232 Jan 14 01:31:53.213226 containerd[1684]: time="2026-01-14T01:31:53.213163084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8pvvl,Uid:bd5cc7fd-b254-4379-9040-ff9c77e0e6be,Namespace:kube-system,Attempt:0,} returns sandbox id \"1d63f9a299576723b3516f0bce422fec640509a4fc56c5254388ff09c55082fe\"" Jan 14 01:31:53.216960 containerd[1684]: time="2026-01-14T01:31:53.216935639Z" level=info msg="CreateContainer within sandbox \"1d63f9a299576723b3516f0bce422fec640509a4fc56c5254388ff09c55082fe\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 14 01:31:53.231576 containerd[1684]: time="2026-01-14T01:31:53.231539540Z" level=info 
msg="Container c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:31:53.241701 containerd[1684]: time="2026-01-14T01:31:53.241670398Z" level=info msg="CreateContainer within sandbox \"1d63f9a299576723b3516f0bce422fec640509a4fc56c5254388ff09c55082fe\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5\"" Jan 14 01:31:53.242236 containerd[1684]: time="2026-01-14T01:31:53.242165177Z" level=info msg="StartContainer for \"c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5\"" Jan 14 01:31:53.243656 containerd[1684]: time="2026-01-14T01:31:53.243636043Z" level=info msg="connecting to shim c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5" address="unix:///run/containerd/s/bce570d732bf4c8c73522f89b4e62f6ea4ef07cf9f50ff3f00c99c4adc96a412" protocol=ttrpc version=3 Jan 14 01:31:53.266063 systemd[1]: Started cri-containerd-c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5.scope - libcontainer container c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5. 
Jan 14 01:31:53.312000 audit: BPF prog-id=138 op=LOAD Jan 14 01:31:53.312000 audit[3015]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2977 pid=3015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.312000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336626339333038313666333533323861343433353739656461643563 Jan 14 01:31:53.312000 audit: BPF prog-id=139 op=LOAD Jan 14 01:31:53.312000 audit[3015]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2977 pid=3015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.312000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336626339333038313666333533323861343433353739656461643563 Jan 14 01:31:53.312000 audit: BPF prog-id=139 op=UNLOAD Jan 14 01:31:53.312000 audit[3015]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2977 pid=3015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.312000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336626339333038313666333533323861343433353739656461643563 Jan 14 01:31:53.312000 audit: BPF prog-id=138 op=UNLOAD Jan 14 01:31:53.312000 audit[3015]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2977 pid=3015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.312000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336626339333038313666333533323861343433353739656461643563 Jan 14 01:31:53.312000 audit: BPF prog-id=140 op=LOAD Jan 14 01:31:53.312000 audit[3015]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2977 pid=3015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.312000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336626339333038313666333533323861343433353739656461643563 Jan 14 01:31:53.330019 containerd[1684]: time="2026-01-14T01:31:53.329984779Z" level=info msg="StartContainer for \"c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5\" returns successfully" Jan 14 01:31:53.408094 containerd[1684]: time="2026-01-14T01:31:53.407862812Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:tigera-operator-7dcd859c48-gjw9x,Uid:d63518e5-af84-444e-9f27-a47ff2026ae3,Namespace:tigera-operator,Attempt:0,}" Jan 14 01:31:53.433006 containerd[1684]: time="2026-01-14T01:31:53.432932733Z" level=info msg="connecting to shim 2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9" address="unix:///run/containerd/s/18a1b34cd3f078a2bdee90eb3eb3d846643daeecc4b11bff748e6d0e041b0916" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:31:53.438000 audit[3096]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3096 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.441000 audit[3097]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.438000 audit[3096]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff60a2c930 a2=0 a3=7fff60a2c91c items=0 ppid=3028 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.438000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 01:31:53.441000 audit[3097]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc01f91060 a2=0 a3=7ffc01f9104c items=0 ppid=3028 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.441000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 01:31:53.455000 audit[3103]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3103 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 
01:31:53.455000 audit[3103]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc276a6870 a2=0 a3=7ffc276a685c items=0 ppid=3028 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.455000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 01:31:53.457000 audit[3104]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=3104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.457000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe33cf10c0 a2=0 a3=7ffe33cf10ac items=0 ppid=3028 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.457000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 01:31:53.458000 audit[3105]: NETFILTER_CFG table=filter:58 family=2 entries=1 op=nft_register_chain pid=3105 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.458000 audit[3105]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc7ab7bb40 a2=0 a3=7ffc7ab7bb2c items=0 ppid=3028 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.458000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 01:31:53.459000 audit[3111]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3111 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.459000 audit[3111]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff1ea97120 a2=0 a3=7fff1ea9710c items=0 ppid=3028 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.459000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 01:31:53.471132 systemd[1]: Started cri-containerd-2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9.scope - libcontainer container 2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9. Jan 14 01:31:53.480000 audit: BPF prog-id=141 op=LOAD Jan 14 01:31:53.481000 audit: BPF prog-id=142 op=LOAD Jan 14 01:31:53.481000 audit[3099]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3084 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261616263373363653561633235333636346166666563336264343133 Jan 14 01:31:53.481000 audit: BPF prog-id=142 op=UNLOAD Jan 14 01:31:53.481000 audit[3099]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.481000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261616263373363653561633235333636346166666563336264343133 Jan 14 01:31:53.481000 audit: BPF prog-id=143 op=LOAD Jan 14 01:31:53.481000 audit[3099]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3084 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261616263373363653561633235333636346166666563336264343133 Jan 14 01:31:53.481000 audit: BPF prog-id=144 op=LOAD Jan 14 01:31:53.481000 audit[3099]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3084 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261616263373363653561633235333636346166666563336264343133 Jan 14 01:31:53.481000 audit: BPF prog-id=144 op=UNLOAD Jan 14 01:31:53.481000 audit[3099]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:31:53.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261616263373363653561633235333636346166666563336264343133 Jan 14 01:31:53.481000 audit: BPF prog-id=143 op=UNLOAD Jan 14 01:31:53.481000 audit[3099]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261616263373363653561633235333636346166666563336264343133 Jan 14 01:31:53.481000 audit: BPF prog-id=145 op=LOAD Jan 14 01:31:53.481000 audit[3099]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3084 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261616263373363653561633235333636346166666563336264343133 Jan 14 01:31:53.517247 containerd[1684]: time="2026-01-14T01:31:53.517194849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-gjw9x,Uid:d63518e5-af84-444e-9f27-a47ff2026ae3,Namespace:tigera-operator,Attempt:0,} returns sandbox id 
\"2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9\"" Jan 14 01:31:53.519659 containerd[1684]: time="2026-01-14T01:31:53.519559771Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 14 01:31:53.542000 audit[3134]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.542000 audit[3134]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffc2e2d4d60 a2=0 a3=7ffc2e2d4d4c items=0 ppid=3028 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.542000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 01:31:53.545000 audit[3136]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3136 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.545000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffdc0b99bf0 a2=0 a3=7ffdc0b99bdc items=0 ppid=3028 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.545000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 14 01:31:53.553000 audit[3139]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3139 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.553000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 
a1=7ffdb5e26b20 a2=0 a3=7ffdb5e26b0c items=0 ppid=3028 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.553000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 14 01:31:53.555000 audit[3140]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.555000 audit[3140]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdfe16bae0 a2=0 a3=7ffdfe16bacc items=0 ppid=3028 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.555000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 01:31:53.557000 audit[3142]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.557000 audit[3142]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffef4099e0 a2=0 a3=7fffef4099cc items=0 ppid=3028 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.557000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 01:31:53.559000 audit[3143]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.559000 audit[3143]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff82b9ad10 a2=0 a3=7fff82b9acfc items=0 ppid=3028 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.559000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 01:31:53.561000 audit[3145]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.561000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe818c8f40 a2=0 a3=7ffe818c8f2c items=0 ppid=3028 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.561000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 14 01:31:53.565000 audit[3148]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.565000 audit[3148]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=744 a0=3 a1=7ffe41914ca0 a2=0 a3=7ffe41914c8c items=0 ppid=3028 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.565000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 14 01:31:53.566000 audit[3149]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.566000 audit[3149]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffc129aaf0 a2=0 a3=7fffc129aadc items=0 ppid=3028 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.566000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 01:31:53.569000 audit[3151]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.569000 audit[3151]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffd23d6fb0 a2=0 a3=7fffd23d6f9c items=0 ppid=3028 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.569000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 01:31:53.570000 audit[3152]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.570000 audit[3152]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc59cd96b0 a2=0 a3=7ffc59cd969c items=0 ppid=3028 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.570000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 01:31:53.573000 audit[3154]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.573000 audit[3154]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe8de062f0 a2=0 a3=7ffe8de062dc items=0 ppid=3028 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.573000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:31:53.576000 audit[3157]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.576000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 
a1=7ffce82fec80 a2=0 a3=7ffce82fec6c items=0 ppid=3028 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.576000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:31:53.580000 audit[3160]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3160 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.580000 audit[3160]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe704bed20 a2=0 a3=7ffe704bed0c items=0 ppid=3028 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.580000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 14 01:31:53.581000 audit[3161]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3161 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.581000 audit[3161]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc71ec5a00 a2=0 a3=7ffc71ec59ec items=0 ppid=3028 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.581000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 01:31:53.583000 audit[3163]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3163 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.583000 audit[3163]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffdaa5a3000 a2=0 a3=7ffdaa5a2fec items=0 ppid=3028 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.583000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:31:53.587000 audit[3166]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3166 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.587000 audit[3166]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe35557b30 a2=0 a3=7ffe35557b1c items=0 ppid=3028 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.587000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:31:53.588000 audit[3167]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.588000 audit[3167]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee8f0e610 a2=0 a3=7ffee8f0e5fc items=0 ppid=3028 pid=3167 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.588000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 01:31:53.590000 audit[3169]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3169 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:31:53.590000 audit[3169]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffff3a634a0 a2=0 a3=7ffff3a6348c items=0 ppid=3028 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.590000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 01:31:53.614000 audit[3175]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:31:53.614000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fffe93a8a90 a2=0 a3=7fffe93a8a7c items=0 ppid=3028 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.614000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:31:53.621000 audit[3175]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3175 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 14 01:31:53.621000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7fffe93a8a90 a2=0 a3=7fffe93a8a7c items=0 ppid=3028 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.621000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:31:53.622000 audit[3180]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3180 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.622000 audit[3180]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe87eef1f0 a2=0 a3=7ffe87eef1dc items=0 ppid=3028 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.622000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 01:31:53.625000 audit[3182]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3182 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.625000 audit[3182]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffc4fba77c0 a2=0 a3=7ffc4fba77ac items=0 ppid=3028 pid=3182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.625000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 14 01:31:53.629000 audit[3185]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3185 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.629000 audit[3185]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffff79b76c0 a2=0 a3=7ffff79b76ac items=0 ppid=3028 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.629000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 14 01:31:53.630000 audit[3186]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.630000 audit[3186]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe235fa780 a2=0 a3=7ffe235fa76c items=0 ppid=3028 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.630000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 01:31:53.633000 audit[3188]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.633000 audit[3188]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=528 a0=3 a1=7ffca8c599c0 a2=0 a3=7ffca8c599ac items=0 ppid=3028 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.633000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 01:31:53.634000 audit[3189]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3189 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.634000 audit[3189]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffb2765140 a2=0 a3=7fffb276512c items=0 ppid=3028 pid=3189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.634000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 01:31:53.636000 audit[3191]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.636000 audit[3191]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd088d5130 a2=0 a3=7ffd088d511c items=0 ppid=3028 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.636000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 14 01:31:53.639000 audit[3194]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.639000 audit[3194]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffdcaadc850 a2=0 a3=7ffdcaadc83c items=0 ppid=3028 pid=3194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.639000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 14 01:31:53.640000 audit[3195]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.640000 audit[3195]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffce8162830 a2=0 a3=7ffce816281c items=0 ppid=3028 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.640000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 01:31:53.643000 audit[3197]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.643000 audit[3197]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=528 a0=3 a1=7ffd6b87ff30 a2=0 a3=7ffd6b87ff1c items=0 ppid=3028 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.643000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 01:31:53.644000 audit[3198]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3198 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.644000 audit[3198]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffa7d358a0 a2=0 a3=7fffa7d3588c items=0 ppid=3028 pid=3198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.644000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 01:31:53.646000 audit[3200]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3200 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.646000 audit[3200]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdde2a6280 a2=0 a3=7ffdde2a626c items=0 ppid=3028 pid=3200 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.646000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:31:53.650000 audit[3203]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.650000 audit[3203]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff7ddbf1b0 a2=0 a3=7fff7ddbf19c items=0 ppid=3028 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.650000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 14 01:31:53.653000 audit[3206]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3206 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.653000 audit[3206]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd0d813a90 a2=0 a3=7ffd0d813a7c items=0 ppid=3028 pid=3206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.653000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 14 01:31:53.654000 audit[3207]: NETFILTER_CFG table=nat:95 family=10 
entries=1 op=nft_register_chain pid=3207 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.654000 audit[3207]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff5645f6b0 a2=0 a3=7fff5645f69c items=0 ppid=3028 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.654000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 01:31:53.657000 audit[3209]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3209 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.657000 audit[3209]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffe714f1170 a2=0 a3=7ffe714f115c items=0 ppid=3028 pid=3209 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.657000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:31:53.661000 audit[3212]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.661000 audit[3212]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd6c378ee0 a2=0 a3=7ffd6c378ecc items=0 ppid=3028 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.661000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:31:53.662000 audit[3213]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.662000 audit[3213]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe482f5500 a2=0 a3=7ffe482f54ec items=0 ppid=3028 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.662000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 01:31:53.665000 audit[3215]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3215 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.665000 audit[3215]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffd7b941a80 a2=0 a3=7ffd7b941a6c items=0 ppid=3028 pid=3215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.665000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 01:31:53.666000 audit[3216]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3216 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.666000 audit[3216]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe4bfa6c80 a2=0 
a3=7ffe4bfa6c6c items=0 ppid=3028 pid=3216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.666000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 01:31:53.668000 audit[3218]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.668000 audit[3218]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffeddf20590 a2=0 a3=7ffeddf2057c items=0 ppid=3028 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.668000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:31:53.671000 audit[3221]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3221 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:31:53.671000 audit[3221]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffe2680c600 a2=0 a3=7ffe2680c5ec items=0 ppid=3028 pid=3221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.671000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:31:53.677000 audit[3223]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3223 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 01:31:53.677000 audit[3223]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffd164af920 a2=0 a3=7ffd164af90c items=0 ppid=3028 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.677000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:31:53.677000 audit[3223]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3223 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 01:31:53.677000 audit[3223]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffd164af920 a2=0 a3=7ffd164af90c items=0 ppid=3028 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:53.677000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:31:53.810399 kubelet[2924]: I0114 01:31:53.810322 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-8pvvl" podStartSLOduration=1.810281916 podStartE2EDuration="1.810281916s" podCreationTimestamp="2026-01-14 01:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:31:53.810229424 +0000 UTC m=+7.142458732" watchObservedRunningTime="2026-01-14 01:31:53.810281916 +0000 UTC m=+7.142511185" Jan 14 01:31:55.475888 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1662058663.mount: Deactivated successfully. 
Jan 14 01:31:55.892063 containerd[1684]: time="2026-01-14T01:31:55.891473126Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:55.893818 containerd[1684]: time="2026-01-14T01:31:55.893791618Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=1494743" Jan 14 01:31:55.895497 containerd[1684]: time="2026-01-14T01:31:55.895450164Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:55.898446 containerd[1684]: time="2026-01-14T01:31:55.898301652Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:31:55.898753 containerd[1684]: time="2026-01-14T01:31:55.898734900Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.379069982s" Jan 14 01:31:55.898795 containerd[1684]: time="2026-01-14T01:31:55.898759462Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 14 01:31:55.902185 containerd[1684]: time="2026-01-14T01:31:55.902155557Z" level=info msg="CreateContainer within sandbox \"2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 14 01:31:55.924105 containerd[1684]: time="2026-01-14T01:31:55.924017156Z" level=info msg="Container 
3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:31:55.932306 containerd[1684]: time="2026-01-14T01:31:55.932271255Z" level=info msg="CreateContainer within sandbox \"2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a\"" Jan 14 01:31:55.932864 containerd[1684]: time="2026-01-14T01:31:55.932838750Z" level=info msg="StartContainer for \"3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a\"" Jan 14 01:31:55.934906 containerd[1684]: time="2026-01-14T01:31:55.934831666Z" level=info msg="connecting to shim 3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a" address="unix:///run/containerd/s/18a1b34cd3f078a2bdee90eb3eb3d846643daeecc4b11bff748e6d0e041b0916" protocol=ttrpc version=3 Jan 14 01:31:55.956156 systemd[1]: Started cri-containerd-3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a.scope - libcontainer container 3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a. 
Jan 14 01:31:55.965000 audit: BPF prog-id=146 op=LOAD Jan 14 01:31:55.966000 audit: BPF prog-id=147 op=LOAD Jan 14 01:31:55.966000 audit[3232]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3084 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:55.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336343963363039316331376463393564616333643562393034383536 Jan 14 01:31:55.966000 audit: BPF prog-id=147 op=UNLOAD Jan 14 01:31:55.966000 audit[3232]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:55.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336343963363039316331376463393564616333643562393034383536 Jan 14 01:31:55.966000 audit: BPF prog-id=148 op=LOAD Jan 14 01:31:55.966000 audit[3232]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3084 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:55.966000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336343963363039316331376463393564616333643562393034383536 Jan 14 01:31:55.966000 audit: BPF prog-id=149 op=LOAD Jan 14 01:31:55.966000 audit[3232]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3084 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:55.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336343963363039316331376463393564616333643562393034383536 Jan 14 01:31:55.966000 audit: BPF prog-id=149 op=UNLOAD Jan 14 01:31:55.966000 audit[3232]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:55.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336343963363039316331376463393564616333643562393034383536 Jan 14 01:31:55.966000 audit: BPF prog-id=148 op=UNLOAD Jan 14 01:31:55.966000 audit[3232]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:31:55.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336343963363039316331376463393564616333643562393034383536 Jan 14 01:31:55.966000 audit: BPF prog-id=150 op=LOAD Jan 14 01:31:55.966000 audit[3232]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3084 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:31:55.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336343963363039316331376463393564616333643562393034383536 Jan 14 01:31:55.984879 containerd[1684]: time="2026-01-14T01:31:55.984792930Z" level=info msg="StartContainer for \"3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a\" returns successfully" Jan 14 01:31:56.847910 kubelet[2924]: I0114 01:31:56.847793 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-gjw9x" podStartSLOduration=1.466637223 podStartE2EDuration="3.847766436s" podCreationTimestamp="2026-01-14 01:31:53 +0000 UTC" firstStartedPulling="2026-01-14 01:31:53.518632012 +0000 UTC m=+6.850861242" lastFinishedPulling="2026-01-14 01:31:55.899761225 +0000 UTC m=+9.231990455" observedRunningTime="2026-01-14 01:31:56.818012634 +0000 UTC m=+10.150241886" watchObservedRunningTime="2026-01-14 01:31:56.847766436 +0000 UTC m=+10.179995720" Jan 14 01:32:01.416863 sudo[1965]: pam_unix(sudo:session): session closed for user root Jan 14 01:32:01.416000 audit[1965]: USER_END pid=1965 uid=500 auid=500 ses=10 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:32:01.418119 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 14 01:32:01.418188 kernel: audit: type=1106 audit(1768354321.416:517): pid=1965 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:32:01.420000 audit[1965]: CRED_DISP pid=1965 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:32:01.427867 kernel: audit: type=1104 audit(1768354321.420:518): pid=1965 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 14 01:32:01.524014 sshd[1964]: Connection closed by 68.220.241.50 port 45064 Jan 14 01:32:01.526007 sshd-session[1960]: pam_unix(sshd:session): session closed for user core Jan 14 01:32:01.527000 audit[1960]: USER_END pid=1960 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:32:01.535871 kernel: audit: type=1106 audit(1768354321.527:519): pid=1960 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:32:01.535807 systemd[1]: sshd@8-10.0.22.183:22-68.220.241.50:45064.service: Deactivated successfully. Jan 14 01:32:01.527000 audit[1960]: CRED_DISP pid=1960 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:32:01.538832 systemd[1]: session-10.scope: Deactivated successfully. Jan 14 01:32:01.540209 systemd[1]: session-10.scope: Consumed 5.325s CPU time, 231.4M memory peak. 
Jan 14 01:32:01.540884 kernel: audit: type=1104 audit(1768354321.527:520): pid=1960 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:32:01.535000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.22.183:22-68.220.241.50:45064 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:32:01.541739 systemd-logind[1652]: Session 10 logged out. Waiting for processes to exit. Jan 14 01:32:01.545981 kernel: audit: type=1131 audit(1768354321.535:521): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.22.183:22-68.220.241.50:45064 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:32:01.547270 systemd-logind[1652]: Removed session 10. 
Jan 14 01:32:02.134950 kernel: audit: type=1325 audit(1768354322.104:522): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3314 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:02.104000 audit[3314]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3314 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:02.104000 audit[3314]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffe09d1dc80 a2=0 a3=7ffe09d1dc6c items=0 ppid=3028 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:02.141877 kernel: audit: type=1300 audit(1768354322.104:522): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffe09d1dc80 a2=0 a3=7ffe09d1dc6c items=0 ppid=3028 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:02.104000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:02.145870 kernel: audit: type=1327 audit(1768354322.104:522): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:02.136000 audit[3314]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3314 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:02.150890 kernel: audit: type=1325 audit(1768354322.136:523): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3314 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:02.136000 audit[3314]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe09d1dc80 a2=0 a3=0 items=0 ppid=3028 pid=3314 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:02.157865 kernel: audit: type=1300 audit(1768354322.136:523): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe09d1dc80 a2=0 a3=0 items=0 ppid=3028 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:02.136000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:02.182000 audit[3316]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:02.182000 audit[3316]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc1423a490 a2=0 a3=7ffc1423a47c items=0 ppid=3028 pid=3316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:02.182000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:02.188000 audit[3316]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:02.188000 audit[3316]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc1423a490 a2=0 a3=0 items=0 ppid=3028 pid=3316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:02.188000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:04.153000 audit[3319]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3319 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:04.153000 audit[3319]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffdb8cdf3f0 a2=0 a3=7ffdb8cdf3dc items=0 ppid=3028 pid=3319 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:04.153000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:04.157000 audit[3319]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3319 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:04.157000 audit[3319]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdb8cdf3f0 a2=0 a3=0 items=0 ppid=3028 pid=3319 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:04.157000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:04.168000 audit[3321]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:04.168000 audit[3321]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffef0de17d0 a2=0 a3=7ffef0de17bc items=0 ppid=3028 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:04.168000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:04.173000 audit[3321]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:04.173000 audit[3321]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffef0de17d0 a2=0 a3=0 items=0 ppid=3028 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:04.173000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:05.197000 audit[3323]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:05.197000 audit[3323]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffed827c0f0 a2=0 a3=7ffed827c0dc items=0 ppid=3028 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:05.197000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:05.204000 audit[3323]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:05.204000 audit[3323]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffed827c0f0 a2=0 a3=0 items=0 ppid=3028 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:05.204000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:05.898434 systemd[1]: Created slice kubepods-besteffort-podd69b0aa0_16c1_4c29_86b5_424d99c12b4b.slice - libcontainer container kubepods-besteffort-podd69b0aa0_16c1_4c29_86b5_424d99c12b4b.slice. Jan 14 01:32:05.975365 kubelet[2924]: I0114 01:32:05.975260 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d69b0aa0-16c1-4c29-86b5-424d99c12b4b-typha-certs\") pod \"calico-typha-69fb88f466-8zm7j\" (UID: \"d69b0aa0-16c1-4c29-86b5-424d99c12b4b\") " pod="calico-system/calico-typha-69fb88f466-8zm7j" Jan 14 01:32:05.975365 kubelet[2924]: I0114 01:32:05.975297 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-242bf\" (UniqueName: \"kubernetes.io/projected/d69b0aa0-16c1-4c29-86b5-424d99c12b4b-kube-api-access-242bf\") pod \"calico-typha-69fb88f466-8zm7j\" (UID: \"d69b0aa0-16c1-4c29-86b5-424d99c12b4b\") " pod="calico-system/calico-typha-69fb88f466-8zm7j" Jan 14 01:32:05.975365 kubelet[2924]: I0114 01:32:05.975317 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d69b0aa0-16c1-4c29-86b5-424d99c12b4b-tigera-ca-bundle\") pod \"calico-typha-69fb88f466-8zm7j\" (UID: \"d69b0aa0-16c1-4c29-86b5-424d99c12b4b\") " pod="calico-system/calico-typha-69fb88f466-8zm7j" Jan 14 01:32:06.094162 systemd[1]: Created slice kubepods-besteffort-podeed9480f_a11e_489d_85cf_16bcfab6aa4c.slice - libcontainer container kubepods-besteffort-podeed9480f_a11e_489d_85cf_16bcfab6aa4c.slice. 
Jan 14 01:32:06.176685 kubelet[2924]: I0114 01:32:06.176571 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-xtables-lock\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176685 kubelet[2924]: I0114 01:32:06.176614 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvvwq\" (UniqueName: \"kubernetes.io/projected/eed9480f-a11e-489d-85cf-16bcfab6aa4c-kube-api-access-bvvwq\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176685 kubelet[2924]: I0114 01:32:06.176631 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-flexvol-driver-host\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176685 kubelet[2924]: I0114 01:32:06.176645 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eed9480f-a11e-489d-85cf-16bcfab6aa4c-tigera-ca-bundle\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176685 kubelet[2924]: I0114 01:32:06.176663 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-cni-net-dir\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176909 kubelet[2924]: I0114 
01:32:06.176684 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-policysync\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176909 kubelet[2924]: I0114 01:32:06.176698 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-var-lib-calico\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176909 kubelet[2924]: I0114 01:32:06.176713 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-cni-bin-dir\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176909 kubelet[2924]: I0114 01:32:06.176727 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-cni-log-dir\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.176909 kubelet[2924]: I0114 01:32:06.176739 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-var-run-calico\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.177025 kubelet[2924]: I0114 01:32:06.176756 2924 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/eed9480f-a11e-489d-85cf-16bcfab6aa4c-node-certs\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.177025 kubelet[2924]: I0114 01:32:06.176772 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eed9480f-a11e-489d-85cf-16bcfab6aa4c-lib-modules\") pod \"calico-node-46xxr\" (UID: \"eed9480f-a11e-489d-85cf-16bcfab6aa4c\") " pod="calico-system/calico-node-46xxr" Jan 14 01:32:06.204698 containerd[1684]: time="2026-01-14T01:32:06.204662252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-69fb88f466-8zm7j,Uid:d69b0aa0-16c1-4c29-86b5-424d99c12b4b,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:06.213000 audit[3327]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:06.213000 audit[3327]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffecbe0ebb0 a2=0 a3=7ffecbe0eb9c items=0 ppid=3028 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.213000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:06.217000 audit[3327]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:06.217000 audit[3327]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffecbe0ebb0 a2=0 a3=0 items=0 ppid=3028 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.217000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:06.233060 containerd[1684]: time="2026-01-14T01:32:06.233022277Z" level=info msg="connecting to shim 2b881cdb8f7de8a825950033b3cd54d9f4ee50f6a4ba3299f9b170473d97e836" address="unix:///run/containerd/s/7f7963632091cd9ef4a08c81df4d89d9bd8b2b4f3f02117d6407299fa499c4af" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:06.266715 kubelet[2924]: E0114 01:32:06.266418 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:06.271191 systemd[1]: Started cri-containerd-2b881cdb8f7de8a825950033b3cd54d9f4ee50f6a4ba3299f9b170473d97e836.scope - libcontainer container 2b881cdb8f7de8a825950033b3cd54d9f4ee50f6a4ba3299f9b170473d97e836. Jan 14 01:32:06.280525 kubelet[2924]: E0114 01:32:06.280455 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.280525 kubelet[2924]: W0114 01:32:06.280472 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.280525 kubelet[2924]: E0114 01:32:06.280497 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.288616 kubelet[2924]: E0114 01:32:06.288509 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.288616 kubelet[2924]: W0114 01:32:06.288526 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.288616 kubelet[2924]: E0114 01:32:06.288541 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.296749 kubelet[2924]: E0114 01:32:06.296673 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.296749 kubelet[2924]: W0114 01:32:06.296690 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.296749 kubelet[2924]: E0114 01:32:06.296711 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.299000 audit: BPF prog-id=151 op=LOAD Jan 14 01:32:06.299000 audit: BPF prog-id=152 op=LOAD Jan 14 01:32:06.299000 audit[3347]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262383831636462386637646538613832353935303033336233636435 Jan 14 01:32:06.299000 audit: BPF prog-id=152 op=UNLOAD Jan 14 01:32:06.299000 audit[3347]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262383831636462386637646538613832353935303033336233636435 Jan 14 01:32:06.299000 audit: BPF prog-id=153 op=LOAD Jan 14 01:32:06.299000 audit[3347]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.299000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262383831636462386637646538613832353935303033336233636435 Jan 14 01:32:06.299000 audit: BPF prog-id=154 op=LOAD Jan 14 01:32:06.299000 audit[3347]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262383831636462386637646538613832353935303033336233636435 Jan 14 01:32:06.299000 audit: BPF prog-id=154 op=UNLOAD Jan 14 01:32:06.299000 audit[3347]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262383831636462386637646538613832353935303033336233636435 Jan 14 01:32:06.299000 audit: BPF prog-id=153 op=UNLOAD Jan 14 01:32:06.299000 audit[3347]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:32:06.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262383831636462386637646538613832353935303033336233636435 Jan 14 01:32:06.299000 audit: BPF prog-id=155 op=LOAD Jan 14 01:32:06.299000 audit[3347]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262383831636462386637646538613832353935303033336233636435 Jan 14 01:32:06.346001 containerd[1684]: time="2026-01-14T01:32:06.345895423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-69fb88f466-8zm7j,Uid:d69b0aa0-16c1-4c29-86b5-424d99c12b4b,Namespace:calico-system,Attempt:0,} returns sandbox id \"2b881cdb8f7de8a825950033b3cd54d9f4ee50f6a4ba3299f9b170473d97e836\"" Jan 14 01:32:06.348968 containerd[1684]: time="2026-01-14T01:32:06.348942156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 14 01:32:06.366088 kubelet[2924]: E0114 01:32:06.366063 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.366211 kubelet[2924]: W0114 01:32:06.366169 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.366211 kubelet[2924]: E0114 01:32:06.366190 2924 plugins.go:695] "Error 
dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.366487 kubelet[2924]: E0114 01:32:06.366434 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.366487 kubelet[2924]: W0114 01:32:06.366446 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.366487 kubelet[2924]: E0114 01:32:06.366456 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.366780 kubelet[2924]: E0114 01:32:06.366770 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.366780 kubelet[2924]: W0114 01:32:06.366779 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.366836 kubelet[2924]: E0114 01:32:06.366788 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.367166 kubelet[2924]: E0114 01:32:06.367156 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.367166 kubelet[2924]: W0114 01:32:06.367165 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.367220 kubelet[2924]: E0114 01:32:06.367173 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.367555 kubelet[2924]: E0114 01:32:06.367545 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.367579 kubelet[2924]: W0114 01:32:06.367554 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.367579 kubelet[2924]: E0114 01:32:06.367563 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.367942 kubelet[2924]: E0114 01:32:06.367931 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.367970 kubelet[2924]: W0114 01:32:06.367942 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.367970 kubelet[2924]: E0114 01:32:06.367951 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.368396 kubelet[2924]: E0114 01:32:06.368385 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.368428 kubelet[2924]: W0114 01:32:06.368396 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.368428 kubelet[2924]: E0114 01:32:06.368406 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.368816 kubelet[2924]: E0114 01:32:06.368804 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.368816 kubelet[2924]: W0114 01:32:06.368815 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.368888 kubelet[2924]: E0114 01:32:06.368825 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.369114 kubelet[2924]: E0114 01:32:06.369103 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.369172 kubelet[2924]: W0114 01:32:06.369114 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.369201 kubelet[2924]: E0114 01:32:06.369171 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.369547 kubelet[2924]: E0114 01:32:06.369536 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.369577 kubelet[2924]: W0114 01:32:06.369547 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.369577 kubelet[2924]: E0114 01:32:06.369561 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.369920 kubelet[2924]: E0114 01:32:06.369909 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.369920 kubelet[2924]: W0114 01:32:06.369919 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.369974 kubelet[2924]: E0114 01:32:06.369927 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.370163 kubelet[2924]: E0114 01:32:06.370152 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.370163 kubelet[2924]: W0114 01:32:06.370161 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.370209 kubelet[2924]: E0114 01:32:06.370168 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.370523 kubelet[2924]: E0114 01:32:06.370511 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.370552 kubelet[2924]: W0114 01:32:06.370522 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.370552 kubelet[2924]: E0114 01:32:06.370532 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.370870 kubelet[2924]: E0114 01:32:06.370839 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.370899 kubelet[2924]: W0114 01:32:06.370869 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.370899 kubelet[2924]: E0114 01:32:06.370878 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.371403 kubelet[2924]: E0114 01:32:06.371139 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.371403 kubelet[2924]: W0114 01:32:06.371150 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.371403 kubelet[2924]: E0114 01:32:06.371212 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.371568 kubelet[2924]: E0114 01:32:06.371556 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.371568 kubelet[2924]: W0114 01:32:06.371566 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.371617 kubelet[2924]: E0114 01:32:06.371575 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.371948 kubelet[2924]: E0114 01:32:06.371899 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.371948 kubelet[2924]: W0114 01:32:06.371909 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.371948 kubelet[2924]: E0114 01:32:06.371917 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.372501 kubelet[2924]: E0114 01:32:06.372443 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.372501 kubelet[2924]: W0114 01:32:06.372454 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.372501 kubelet[2924]: E0114 01:32:06.372462 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.372923 kubelet[2924]: E0114 01:32:06.372909 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.372923 kubelet[2924]: W0114 01:32:06.372919 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.372994 kubelet[2924]: E0114 01:32:06.372928 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.373221 kubelet[2924]: E0114 01:32:06.373210 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.373221 kubelet[2924]: W0114 01:32:06.373220 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.373266 kubelet[2924]: E0114 01:32:06.373228 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.378532 kubelet[2924]: E0114 01:32:06.378512 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.378532 kubelet[2924]: W0114 01:32:06.378524 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.378532 kubelet[2924]: E0114 01:32:06.378534 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.378642 kubelet[2924]: I0114 01:32:06.378568 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-545mr\" (UniqueName: \"kubernetes.io/projected/e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2-kube-api-access-545mr\") pod \"csi-node-driver-bfb5m\" (UID: \"e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2\") " pod="calico-system/csi-node-driver-bfb5m" Jan 14 01:32:06.378829 kubelet[2924]: E0114 01:32:06.378722 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.378829 kubelet[2924]: W0114 01:32:06.378732 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.378829 kubelet[2924]: E0114 01:32:06.378739 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.378829 kubelet[2924]: I0114 01:32:06.378752 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2-kubelet-dir\") pod \"csi-node-driver-bfb5m\" (UID: \"e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2\") " pod="calico-system/csi-node-driver-bfb5m" Jan 14 01:32:06.379456 kubelet[2924]: E0114 01:32:06.379355 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.379456 kubelet[2924]: W0114 01:32:06.379367 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.379456 kubelet[2924]: E0114 01:32:06.379383 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.379626 kubelet[2924]: E0114 01:32:06.379619 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.379668 kubelet[2924]: W0114 01:32:06.379662 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.379778 kubelet[2924]: E0114 01:32:06.379708 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.381053 kubelet[2924]: E0114 01:32:06.380926 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.381053 kubelet[2924]: W0114 01:32:06.380939 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.381053 kubelet[2924]: E0114 01:32:06.380953 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.381053 kubelet[2924]: I0114 01:32:06.380973 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2-registration-dir\") pod \"csi-node-driver-bfb5m\" (UID: \"e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2\") " pod="calico-system/csi-node-driver-bfb5m" Jan 14 01:32:06.381261 kubelet[2924]: E0114 01:32:06.381189 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.381261 kubelet[2924]: W0114 01:32:06.381198 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.381261 kubelet[2924]: E0114 01:32:06.381217 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.381335 kubelet[2924]: I0114 01:32:06.381248 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2-socket-dir\") pod \"csi-node-driver-bfb5m\" (UID: \"e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2\") " pod="calico-system/csi-node-driver-bfb5m" Jan 14 01:32:06.381448 kubelet[2924]: E0114 01:32:06.381380 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.381448 kubelet[2924]: W0114 01:32:06.381387 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.381448 kubelet[2924]: E0114 01:32:06.381401 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.381621 kubelet[2924]: E0114 01:32:06.381548 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.381621 kubelet[2924]: W0114 01:32:06.381554 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.381621 kubelet[2924]: E0114 01:32:06.381565 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.381804 kubelet[2924]: E0114 01:32:06.381713 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.381804 kubelet[2924]: W0114 01:32:06.381719 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.381804 kubelet[2924]: E0114 01:32:06.381731 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.381804 kubelet[2924]: I0114 01:32:06.381746 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2-varrun\") pod \"csi-node-driver-bfb5m\" (UID: \"e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2\") " pod="calico-system/csi-node-driver-bfb5m" Jan 14 01:32:06.382007 kubelet[2924]: E0114 01:32:06.381943 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.382007 kubelet[2924]: W0114 01:32:06.381950 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.382007 kubelet[2924]: E0114 01:32:06.381959 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.382145 kubelet[2924]: E0114 01:32:06.382139 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.382186 kubelet[2924]: W0114 01:32:06.382180 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.382220 kubelet[2924]: E0114 01:32:06.382215 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.382487 kubelet[2924]: E0114 01:32:06.382428 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.382487 kubelet[2924]: W0114 01:32:06.382436 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.382487 kubelet[2924]: E0114 01:32:06.382442 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.382633 kubelet[2924]: E0114 01:32:06.382628 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.382666 kubelet[2924]: W0114 01:32:06.382661 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.382756 kubelet[2924]: E0114 01:32:06.382695 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.382831 kubelet[2924]: E0114 01:32:06.382825 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.382897 kubelet[2924]: W0114 01:32:06.382890 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.383001 kubelet[2924]: E0114 01:32:06.382932 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.383073 kubelet[2924]: E0114 01:32:06.383067 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.383107 kubelet[2924]: W0114 01:32:06.383102 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.383139 kubelet[2924]: E0114 01:32:06.383134 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.402863 containerd[1684]: time="2026-01-14T01:32:06.402689335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-46xxr,Uid:eed9480f-a11e-489d-85cf-16bcfab6aa4c,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:06.433320 containerd[1684]: time="2026-01-14T01:32:06.433219986Z" level=info msg="connecting to shim c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563" address="unix:///run/containerd/s/9afe881497127ae7ad0801ea13f2921bc7ec8dcbcf9a38e2816f34e6e826a3c8" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:06.459081 systemd[1]: Started cri-containerd-c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563.scope - libcontainer container c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563. 
Jan 14 01:32:06.472972 kernel: kauditd_printk_skb: 53 callbacks suppressed Jan 14 01:32:06.473069 kernel: audit: type=1334 audit(1768354326.469:542): prog-id=156 op=LOAD Jan 14 01:32:06.469000 audit: BPF prog-id=156 op=LOAD Jan 14 01:32:06.475039 kernel: audit: type=1334 audit(1768354326.470:543): prog-id=157 op=LOAD Jan 14 01:32:06.470000 audit: BPF prog-id=157 op=LOAD Jan 14 01:32:06.470000 audit[3439]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.477190 kernel: audit: type=1300 audit(1768354326.470:543): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.481827 kernel: audit: type=1327 audit(1768354326.470:543): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.483273 kubelet[2924]: E0114 01:32:06.483232 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.483273 kubelet[2924]: W0114 01:32:06.483247 2924 driver-call.go:149] FlexVolume: driver call failed: 
executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.483449 kubelet[2924]: E0114 01:32:06.483377 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.483659 kubelet[2924]: E0114 01:32:06.483581 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.483659 kubelet[2924]: W0114 01:32:06.483588 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.483659 kubelet[2924]: E0114 01:32:06.483596 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.485056 kernel: audit: type=1334 audit(1768354326.470:544): prog-id=157 op=UNLOAD Jan 14 01:32:06.470000 audit: BPF prog-id=157 op=UNLOAD Jan 14 01:32:06.485138 kubelet[2924]: E0114 01:32:06.484416 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.485138 kubelet[2924]: W0114 01:32:06.484426 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.485572 kubelet[2924]: E0114 01:32:06.485305 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.486662 kubelet[2924]: E0114 01:32:06.486544 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.487031 kubelet[2924]: W0114 01:32:06.486860 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.487031 kubelet[2924]: E0114 01:32:06.486905 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.487323 kubelet[2924]: E0114 01:32:06.487282 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.487323 kubelet[2924]: W0114 01:32:06.487292 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.487323 kubelet[2924]: E0114 01:32:06.487316 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.487737 kubelet[2924]: E0114 01:32:06.487686 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.487737 kubelet[2924]: W0114 01:32:06.487695 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.487737 kubelet[2924]: E0114 01:32:06.487717 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.488020 kubelet[2924]: E0114 01:32:06.487959 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.488020 kubelet[2924]: W0114 01:32:06.487967 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.488020 kubelet[2924]: E0114 01:32:06.487984 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.488212 kubelet[2924]: E0114 01:32:06.488155 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.488212 kubelet[2924]: W0114 01:32:06.488162 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.488212 kubelet[2924]: E0114 01:32:06.488176 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.470000 audit[3439]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.488597 kubelet[2924]: E0114 01:32:06.488454 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.488597 kubelet[2924]: W0114 01:32:06.488463 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.488972 kubelet[2924]: E0114 01:32:06.488895 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.488972 kubelet[2924]: W0114 01:32:06.488902 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.489114 kubelet[2924]: E0114 01:32:06.489107 2924 
driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.489156 kubelet[2924]: W0114 01:32:06.489150 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.489494 kubelet[2924]: E0114 01:32:06.489353 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.489494 kubelet[2924]: W0114 01:32:06.489443 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.489617 kubelet[2924]: E0114 01:32:06.489612 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.489721 kubelet[2924]: W0114 01:32:06.489661 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.489721 kubelet[2924]: E0114 01:32:06.489672 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.489962 kubelet[2924]: E0114 01:32:06.489924 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.490138 kubelet[2924]: W0114 01:32:06.490001 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.490138 kubelet[2924]: E0114 01:32:06.490012 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.490261 kubelet[2924]: E0114 01:32:06.490255 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.490309 kubelet[2924]: W0114 01:32:06.490303 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.490486 kubelet[2924]: E0114 01:32:06.490367 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.490601 kubelet[2924]: E0114 01:32:06.490594 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.490635 kubelet[2924]: W0114 01:32:06.490629 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.490674 kubelet[2924]: E0114 01:32:06.490668 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.490725 kubelet[2924]: E0114 01:32:06.490719 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.490937 kubelet[2924]: E0114 01:32:06.490879 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.490937 kubelet[2924]: W0114 01:32:06.490886 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.490937 kubelet[2924]: E0114 01:32:06.490892 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.491146 kubelet[2924]: E0114 01:32:06.491069 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.491146 kubelet[2924]: W0114 01:32:06.491076 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.491146 kubelet[2924]: E0114 01:32:06.491082 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.491287 kubelet[2924]: E0114 01:32:06.491281 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.491587 kubelet[2924]: W0114 01:32:06.491314 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.491587 kubelet[2924]: E0114 01:32:06.491322 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.491812 kubelet[2924]: E0114 01:32:06.491765 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.492642 kubelet[2924]: E0114 01:32:06.492632 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.492702 kubelet[2924]: W0114 01:32:06.492695 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.492743 kubelet[2924]: E0114 01:32:06.492735 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.492859 kernel: audit: type=1300 audit(1768354326.470:544): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.493020 kubelet[2924]: E0114 01:32:06.492961 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.493020 kubelet[2924]: E0114 01:32:06.492977 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.493020 kubelet[2924]: W0114 01:32:06.492984 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.493020 kubelet[2924]: E0114 01:32:06.492983 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.493020 kubelet[2924]: E0114 01:32:06.492993 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.494903 kubelet[2924]: E0114 01:32:06.494886 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.494903 kubelet[2924]: W0114 01:32:06.494898 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.494979 kubelet[2924]: E0114 01:32:06.494910 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.495078 kubelet[2924]: E0114 01:32:06.495067 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.495078 kubelet[2924]: W0114 01:32:06.495076 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.495127 kubelet[2924]: E0114 01:32:06.495083 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.495457 kubelet[2924]: E0114 01:32:06.495215 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.495457 kubelet[2924]: W0114 01:32:06.495224 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.495457 kubelet[2924]: E0114 01:32:06.495230 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:06.495518 kubelet[2924]: E0114 01:32:06.495496 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.495518 kubelet[2924]: W0114 01:32:06.495502 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.495518 kubelet[2924]: E0114 01:32:06.495509 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.500861 kernel: audit: type=1327 audit(1768354326.470:544): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.470000 audit: BPF prog-id=158 op=LOAD Jan 14 01:32:06.504908 kernel: audit: type=1334 audit(1768354326.470:545): prog-id=158 op=LOAD Jan 14 01:32:06.504939 kubelet[2924]: E0114 01:32:06.504915 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:06.504939 kubelet[2924]: W0114 01:32:06.504929 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:06.505015 kubelet[2924]: E0114 01:32:06.504943 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:06.470000 audit[3439]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.511312 kernel: audit: type=1300 audit(1768354326.470:545): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.511362 kernel: audit: type=1327 audit(1768354326.470:545): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.470000 audit: BPF prog-id=159 op=LOAD Jan 14 01:32:06.470000 audit[3439]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.470000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.470000 audit: BPF prog-id=159 op=UNLOAD Jan 14 01:32:06.470000 audit[3439]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.470000 audit: BPF prog-id=158 op=UNLOAD Jan 14 01:32:06.470000 audit[3439]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:06.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.470000 audit: BPF prog-id=160 op=LOAD Jan 14 01:32:06.470000 audit[3439]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:32:06.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338653838333066663337613363323535303938333132366131303962 Jan 14 01:32:06.517546 containerd[1684]: time="2026-01-14T01:32:06.517439906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-46xxr,Uid:eed9480f-a11e-489d-85cf-16bcfab6aa4c,Namespace:calico-system,Attempt:0,} returns sandbox id \"c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563\"" Jan 14 01:32:07.769914 kubelet[2924]: E0114 01:32:07.768721 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:07.855533 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2294495837.mount: Deactivated successfully. 
Jan 14 01:32:08.842723 containerd[1684]: time="2026-01-14T01:32:08.842687157Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:08.844327 containerd[1684]: time="2026-01-14T01:32:08.844208825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 14 01:32:08.845812 containerd[1684]: time="2026-01-14T01:32:08.845783060Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:08.848259 containerd[1684]: time="2026-01-14T01:32:08.848208207Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:08.848765 containerd[1684]: time="2026-01-14T01:32:08.848721261Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.49975254s" Jan 14 01:32:08.848827 containerd[1684]: time="2026-01-14T01:32:08.848817012Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 14 01:32:08.849579 containerd[1684]: time="2026-01-14T01:32:08.849564942Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 14 01:32:08.863451 containerd[1684]: time="2026-01-14T01:32:08.863419370Z" level=info msg="CreateContainer within sandbox \"2b881cdb8f7de8a825950033b3cd54d9f4ee50f6a4ba3299f9b170473d97e836\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 14 01:32:08.883394 containerd[1684]: time="2026-01-14T01:32:08.883304318Z" level=info msg="Container 19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:32:08.892896 containerd[1684]: time="2026-01-14T01:32:08.892869031Z" level=info msg="CreateContainer within sandbox \"2b881cdb8f7de8a825950033b3cd54d9f4ee50f6a4ba3299f9b170473d97e836\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf\"" Jan 14 01:32:08.893880 containerd[1684]: time="2026-01-14T01:32:08.893473930Z" level=info msg="StartContainer for \"19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf\"" Jan 14 01:32:08.894605 containerd[1684]: time="2026-01-14T01:32:08.894578446Z" level=info msg="connecting to shim 19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf" address="unix:///run/containerd/s/7f7963632091cd9ef4a08c81df4d89d9bd8b2b4f3f02117d6407299fa499c4af" protocol=ttrpc version=3 Jan 14 01:32:08.913015 systemd[1]: Started cri-containerd-19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf.scope - libcontainer container 19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf. 
Jan 14 01:32:08.926000 audit: BPF prog-id=161 op=LOAD Jan 14 01:32:08.926000 audit: BPF prog-id=162 op=LOAD Jan 14 01:32:08.926000 audit[3506]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3336 pid=3506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:08.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613935393839383133326565656239306565626239396463313235 Jan 14 01:32:08.926000 audit: BPF prog-id=162 op=UNLOAD Jan 14 01:32:08.926000 audit[3506]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:08.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613935393839383133326565656239306565626239396463313235 Jan 14 01:32:08.926000 audit: BPF prog-id=163 op=LOAD Jan 14 01:32:08.926000 audit[3506]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3336 pid=3506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:08.926000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613935393839383133326565656239306565626239396463313235 Jan 14 01:32:08.926000 audit: BPF prog-id=164 op=LOAD Jan 14 01:32:08.926000 audit[3506]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3336 pid=3506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:08.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613935393839383133326565656239306565626239396463313235 Jan 14 01:32:08.926000 audit: BPF prog-id=164 op=UNLOAD Jan 14 01:32:08.926000 audit[3506]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:08.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613935393839383133326565656239306565626239396463313235 Jan 14 01:32:08.926000 audit: BPF prog-id=163 op=UNLOAD Jan 14 01:32:08.926000 audit[3506]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:32:08.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613935393839383133326565656239306565626239396463313235 Jan 14 01:32:08.926000 audit: BPF prog-id=165 op=LOAD Jan 14 01:32:08.926000 audit[3506]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3336 pid=3506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:08.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613935393839383133326565656239306565626239396463313235 Jan 14 01:32:08.962868 containerd[1684]: time="2026-01-14T01:32:08.962816569Z" level=info msg="StartContainer for \"19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf\" returns successfully" Jan 14 01:32:09.769045 kubelet[2924]: E0114 01:32:09.768613 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:09.844338 kubelet[2924]: I0114 01:32:09.843634 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-69fb88f466-8zm7j" podStartSLOduration=2.342646849 podStartE2EDuration="4.843613269s" podCreationTimestamp="2026-01-14 01:32:05 +0000 UTC" firstStartedPulling="2026-01-14 01:32:06.348460648 +0000 UTC m=+19.680689877" lastFinishedPulling="2026-01-14 
01:32:08.849427067 +0000 UTC m=+22.181656297" observedRunningTime="2026-01-14 01:32:09.842759509 +0000 UTC m=+23.174988775" watchObservedRunningTime="2026-01-14 01:32:09.843613269 +0000 UTC m=+23.175842564" Jan 14 01:32:09.895989 kubelet[2924]: E0114 01:32:09.895943 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.895989 kubelet[2924]: W0114 01:32:09.895967 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.895989 kubelet[2924]: E0114 01:32:09.895986 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.896187 kubelet[2924]: E0114 01:32:09.896138 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.896187 kubelet[2924]: W0114 01:32:09.896144 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.896187 kubelet[2924]: E0114 01:32:09.896151 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.896319 kubelet[2924]: E0114 01:32:09.896298 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.896319 kubelet[2924]: W0114 01:32:09.896307 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.896319 kubelet[2924]: E0114 01:32:09.896313 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.896524 kubelet[2924]: E0114 01:32:09.896513 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.896524 kubelet[2924]: W0114 01:32:09.896521 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.896573 kubelet[2924]: E0114 01:32:09.896527 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.896690 kubelet[2924]: E0114 01:32:09.896674 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.896690 kubelet[2924]: W0114 01:32:09.896682 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.896690 kubelet[2924]: E0114 01:32:09.896688 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.896819 kubelet[2924]: E0114 01:32:09.896809 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.896819 kubelet[2924]: W0114 01:32:09.896817 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.896883 kubelet[2924]: E0114 01:32:09.896823 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.896972 kubelet[2924]: E0114 01:32:09.896962 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.896972 kubelet[2924]: W0114 01:32:09.896970 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.897027 kubelet[2924]: E0114 01:32:09.896976 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.897111 kubelet[2924]: E0114 01:32:09.897101 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.897111 kubelet[2924]: W0114 01:32:09.897109 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.897159 kubelet[2924]: E0114 01:32:09.897115 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.897251 kubelet[2924]: E0114 01:32:09.897242 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.897251 kubelet[2924]: W0114 01:32:09.897249 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.897306 kubelet[2924]: E0114 01:32:09.897255 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.897377 kubelet[2924]: E0114 01:32:09.897368 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.897404 kubelet[2924]: W0114 01:32:09.897378 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.897404 kubelet[2924]: E0114 01:32:09.897383 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.897502 kubelet[2924]: E0114 01:32:09.897488 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.897502 kubelet[2924]: W0114 01:32:09.897500 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.897553 kubelet[2924]: E0114 01:32:09.897505 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.897627 kubelet[2924]: E0114 01:32:09.897618 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.897627 kubelet[2924]: W0114 01:32:09.897627 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.897671 kubelet[2924]: E0114 01:32:09.897632 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.897774 kubelet[2924]: E0114 01:32:09.897765 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.897774 kubelet[2924]: W0114 01:32:09.897773 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.897823 kubelet[2924]: E0114 01:32:09.897778 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.897913 kubelet[2924]: E0114 01:32:09.897904 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.897913 kubelet[2924]: W0114 01:32:09.897912 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.897964 kubelet[2924]: E0114 01:32:09.897918 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.898051 kubelet[2924]: E0114 01:32:09.898041 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.898051 kubelet[2924]: W0114 01:32:09.898049 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.898104 kubelet[2924]: E0114 01:32:09.898054 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.909623 kubelet[2924]: E0114 01:32:09.909573 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.909623 kubelet[2924]: W0114 01:32:09.909595 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.909623 kubelet[2924]: E0114 01:32:09.909611 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.909930 kubelet[2924]: E0114 01:32:09.909773 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.909930 kubelet[2924]: W0114 01:32:09.909780 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.909930 kubelet[2924]: E0114 01:32:09.909787 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.910077 kubelet[2924]: E0114 01:32:09.909976 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.910077 kubelet[2924]: W0114 01:32:09.909982 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.910077 kubelet[2924]: E0114 01:32:09.909996 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.910200 kubelet[2924]: E0114 01:32:09.910143 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.910200 kubelet[2924]: W0114 01:32:09.910149 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.910200 kubelet[2924]: E0114 01:32:09.910161 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.910312 kubelet[2924]: E0114 01:32:09.910294 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.910312 kubelet[2924]: W0114 01:32:09.910300 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.910312 kubelet[2924]: E0114 01:32:09.910310 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.910482 kubelet[2924]: E0114 01:32:09.910465 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.910482 kubelet[2924]: W0114 01:32:09.910474 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.910571 kubelet[2924]: E0114 01:32:09.910486 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.910613 kubelet[2924]: E0114 01:32:09.910592 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.910613 kubelet[2924]: W0114 01:32:09.910597 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.910613 kubelet[2924]: E0114 01:32:09.910603 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.911074 kubelet[2924]: E0114 01:32:09.911049 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.911074 kubelet[2924]: W0114 01:32:09.911062 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.911074 kubelet[2924]: E0114 01:32:09.911075 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.911256 kubelet[2924]: E0114 01:32:09.911202 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.911256 kubelet[2924]: W0114 01:32:09.911208 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.911256 kubelet[2924]: E0114 01:32:09.911215 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.911398 kubelet[2924]: E0114 01:32:09.911333 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.911398 kubelet[2924]: W0114 01:32:09.911339 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.911398 kubelet[2924]: E0114 01:32:09.911347 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.911526 kubelet[2924]: E0114 01:32:09.911481 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.911526 kubelet[2924]: W0114 01:32:09.911487 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.911526 kubelet[2924]: E0114 01:32:09.911493 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.911667 kubelet[2924]: E0114 01:32:09.911613 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.911667 kubelet[2924]: W0114 01:32:09.911619 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.911667 kubelet[2924]: E0114 01:32:09.911624 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.911917 kubelet[2924]: E0114 01:32:09.911904 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.911917 kubelet[2924]: W0114 01:32:09.911912 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.911917 kubelet[2924]: E0114 01:32:09.911919 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.912049 kubelet[2924]: E0114 01:32:09.912030 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.912049 kubelet[2924]: W0114 01:32:09.912036 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.912130 kubelet[2924]: E0114 01:32:09.912050 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.912199 kubelet[2924]: E0114 01:32:09.912165 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.912199 kubelet[2924]: W0114 01:32:09.912171 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.912199 kubelet[2924]: E0114 01:32:09.912185 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.912354 kubelet[2924]: E0114 01:32:09.912339 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.912354 kubelet[2924]: W0114 01:32:09.912344 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.912430 kubelet[2924]: E0114 01:32:09.912356 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:09.912590 kubelet[2924]: E0114 01:32:09.912565 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.912590 kubelet[2924]: W0114 01:32:09.912583 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.912590 kubelet[2924]: E0114 01:32:09.912592 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:32:09.912732 kubelet[2924]: E0114 01:32:09.912717 2924 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:32:09.912732 kubelet[2924]: W0114 01:32:09.912722 2924 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:32:09.912732 kubelet[2924]: E0114 01:32:09.912728 2924 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:32:10.379096 containerd[1684]: time="2026-01-14T01:32:10.379040913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:10.380519 containerd[1684]: time="2026-01-14T01:32:10.380493726Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:10.382469 containerd[1684]: time="2026-01-14T01:32:10.382453567Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:10.385120 containerd[1684]: time="2026-01-14T01:32:10.385096707Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:10.386390 containerd[1684]: time="2026-01-14T01:32:10.386346533Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.536544865s" Jan 14 01:32:10.386452 containerd[1684]: time="2026-01-14T01:32:10.386391616Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 14 01:32:10.390684 containerd[1684]: time="2026-01-14T01:32:10.390657709Z" level=info msg="CreateContainer within sandbox \"c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 14 01:32:10.406133 containerd[1684]: time="2026-01-14T01:32:10.403430844Z" level=info msg="Container 527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:32:10.412449 containerd[1684]: time="2026-01-14T01:32:10.412419250Z" level=info msg="CreateContainer within sandbox \"c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e\"" Jan 14 01:32:10.413829 containerd[1684]: time="2026-01-14T01:32:10.413051353Z" level=info msg="StartContainer for \"527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e\"" Jan 14 01:32:10.414450 containerd[1684]: time="2026-01-14T01:32:10.414407323Z" level=info msg="connecting to shim 527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e" address="unix:///run/containerd/s/9afe881497127ae7ad0801ea13f2921bc7ec8dcbcf9a38e2816f34e6e826a3c8" protocol=ttrpc version=3 Jan 14 01:32:10.437028 systemd[1]: Started cri-containerd-527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e.scope - libcontainer container 
527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e. Jan 14 01:32:10.485000 audit: BPF prog-id=166 op=LOAD Jan 14 01:32:10.485000 audit[3581]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3428 pid=3581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:10.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532376338386531653639623965386232336539386365323264383866 Jan 14 01:32:10.485000 audit: BPF prog-id=167 op=LOAD Jan 14 01:32:10.485000 audit[3581]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3428 pid=3581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:10.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532376338386531653639623965386232336539386365323264383866 Jan 14 01:32:10.485000 audit: BPF prog-id=167 op=UNLOAD Jan 14 01:32:10.485000 audit[3581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:10.485000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532376338386531653639623965386232336539386365323264383866 Jan 14 01:32:10.485000 audit: BPF prog-id=166 op=UNLOAD Jan 14 01:32:10.485000 audit[3581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:10.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532376338386531653639623965386232336539386365323264383866 Jan 14 01:32:10.485000 audit: BPF prog-id=168 op=LOAD Jan 14 01:32:10.485000 audit[3581]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3428 pid=3581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:10.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532376338386531653639623965386232336539386365323264383866 Jan 14 01:32:10.508192 containerd[1684]: time="2026-01-14T01:32:10.508160569Z" level=info msg="StartContainer for \"527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e\" returns successfully" Jan 14 01:32:10.515036 systemd[1]: cri-containerd-527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e.scope: Deactivated successfully. 
Jan 14 01:32:10.517000 audit: BPF prog-id=168 op=UNLOAD Jan 14 01:32:10.518411 containerd[1684]: time="2026-01-14T01:32:10.518384170Z" level=info msg="received container exit event container_id:\"527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e\" id:\"527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e\" pid:3594 exited_at:{seconds:1768354330 nanos:517579046}" Jan 14 01:32:10.554245 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e-rootfs.mount: Deactivated successfully. Jan 14 01:32:10.835024 kubelet[2924]: I0114 01:32:10.835001 2924 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:32:11.768871 kubelet[2924]: E0114 01:32:11.768634 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:12.842130 containerd[1684]: time="2026-01-14T01:32:12.841871700Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 14 01:32:13.767982 kubelet[2924]: E0114 01:32:13.767921 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:15.771181 kubelet[2924]: E0114 01:32:15.770184 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bfb5m" 
podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:16.615863 containerd[1684]: time="2026-01-14T01:32:16.615666536Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:16.616945 containerd[1684]: time="2026-01-14T01:32:16.616919794Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 14 01:32:16.618337 containerd[1684]: time="2026-01-14T01:32:16.618311706Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:16.623640 containerd[1684]: time="2026-01-14T01:32:16.623586374Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:16.624516 containerd[1684]: time="2026-01-14T01:32:16.624105188Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.782202356s" Jan 14 01:32:16.624516 containerd[1684]: time="2026-01-14T01:32:16.624135145Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 14 01:32:16.626485 containerd[1684]: time="2026-01-14T01:32:16.626424152Z" level=info msg="CreateContainer within sandbox \"c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 14 01:32:16.638711 containerd[1684]: 
time="2026-01-14T01:32:16.638678282Z" level=info msg="Container 4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:32:16.649460 containerd[1684]: time="2026-01-14T01:32:16.649411585Z" level=info msg="CreateContainer within sandbox \"c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936\"" Jan 14 01:32:16.650903 containerd[1684]: time="2026-01-14T01:32:16.649994929Z" level=info msg="StartContainer for \"4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936\"" Jan 14 01:32:16.652213 containerd[1684]: time="2026-01-14T01:32:16.652185703Z" level=info msg="connecting to shim 4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936" address="unix:///run/containerd/s/9afe881497127ae7ad0801ea13f2921bc7ec8dcbcf9a38e2816f34e6e826a3c8" protocol=ttrpc version=3 Jan 14 01:32:16.679043 systemd[1]: Started cri-containerd-4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936.scope - libcontainer container 4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936. 
Jan 14 01:32:16.718000 audit: BPF prog-id=169 op=LOAD Jan 14 01:32:16.720256 kernel: kauditd_printk_skb: 50 callbacks suppressed Jan 14 01:32:16.720303 kernel: audit: type=1334 audit(1768354336.718:564): prog-id=169 op=LOAD Jan 14 01:32:16.718000 audit[3639]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3428 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:16.723301 kernel: audit: type=1300 audit(1768354336.718:564): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3428 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:16.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466363237326566623563356635343965306135376563336336656133 Jan 14 01:32:16.727425 kernel: audit: type=1327 audit(1768354336.718:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466363237326566623563356635343965306135376563336336656133 Jan 14 01:32:16.718000 audit: BPF prog-id=170 op=LOAD Jan 14 01:32:16.730426 kernel: audit: type=1334 audit(1768354336.718:565): prog-id=170 op=LOAD Jan 14 01:32:16.718000 audit[3639]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3428 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:16.733510 kernel: audit: type=1300 audit(1768354336.718:565): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3428 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:16.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466363237326566623563356635343965306135376563336336656133 Jan 14 01:32:16.718000 audit: BPF prog-id=170 op=UNLOAD Jan 14 01:32:16.741554 kernel: audit: type=1327 audit(1768354336.718:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466363237326566623563356635343965306135376563336336656133 Jan 14 01:32:16.741602 kernel: audit: type=1334 audit(1768354336.718:566): prog-id=170 op=UNLOAD Jan 14 01:32:16.741621 kernel: audit: type=1300 audit(1768354336.718:566): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:16.718000 audit[3639]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:16.718000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466363237326566623563356635343965306135376563336336656133 Jan 14 01:32:16.747735 kernel: audit: type=1327 audit(1768354336.718:566): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466363237326566623563356635343965306135376563336336656133 Jan 14 01:32:16.718000 audit: BPF prog-id=169 op=UNLOAD Jan 14 01:32:16.750917 kernel: audit: type=1334 audit(1768354336.718:567): prog-id=169 op=UNLOAD Jan 14 01:32:16.718000 audit[3639]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:16.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466363237326566623563356635343965306135376563336336656133 Jan 14 01:32:16.718000 audit: BPF prog-id=171 op=LOAD Jan 14 01:32:16.718000 audit[3639]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3428 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:16.718000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466363237326566623563356635343965306135376563336336656133 Jan 14 01:32:16.761426 containerd[1684]: time="2026-01-14T01:32:16.761399252Z" level=info msg="StartContainer for \"4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936\" returns successfully" Jan 14 01:32:17.768506 kubelet[2924]: E0114 01:32:17.768426 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:18.066220 containerd[1684]: time="2026-01-14T01:32:18.066104866Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 01:32:18.068266 systemd[1]: cri-containerd-4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936.scope: Deactivated successfully. Jan 14 01:32:18.068542 systemd[1]: cri-containerd-4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936.scope: Consumed 477ms CPU time, 196.4M memory peak, 171.3M written to disk. 
Jan 14 01:32:18.070687 containerd[1684]: time="2026-01-14T01:32:18.070625162Z" level=info msg="received container exit event container_id:\"4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936\" id:\"4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936\" pid:3652 exited_at:{seconds:1768354338 nanos:70323083}" Jan 14 01:32:18.071000 audit: BPF prog-id=171 op=UNLOAD Jan 14 01:32:18.091647 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936-rootfs.mount: Deactivated successfully. Jan 14 01:32:18.151701 kubelet[2924]: I0114 01:32:18.151648 2924 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 14 01:32:18.185361 systemd[1]: Created slice kubepods-besteffort-pod8e676668_dd2e_4394_a5c6_387e536f2fb9.slice - libcontainer container kubepods-besteffort-pod8e676668_dd2e_4394_a5c6_387e536f2fb9.slice. Jan 14 01:32:18.194909 systemd[1]: Created slice kubepods-burstable-pode4543e16_4927_4a0f_aab5_23e9d79b2579.slice - libcontainer container kubepods-burstable-pode4543e16_4927_4a0f_aab5_23e9d79b2579.slice. Jan 14 01:32:18.215795 systemd[1]: Created slice kubepods-besteffort-pod327b5d4f_3ff4_43f6_98c0_8969f3f3f2d2.slice - libcontainer container kubepods-besteffort-pod327b5d4f_3ff4_43f6_98c0_8969f3f3f2d2.slice. Jan 14 01:32:18.224885 systemd[1]: Created slice kubepods-besteffort-poda4130f46_1c6e_474c_9a1c_5fd1820934c0.slice - libcontainer container kubepods-besteffort-poda4130f46_1c6e_474c_9a1c_5fd1820934c0.slice. Jan 14 01:32:18.231391 systemd[1]: Created slice kubepods-burstable-podc366ef83_2b27_4825_8b26_f67f058bea78.slice - libcontainer container kubepods-burstable-podc366ef83_2b27_4825_8b26_f67f058bea78.slice. Jan 14 01:32:18.237481 systemd[1]: Created slice kubepods-besteffort-pod7534bfa4_9af8_4640_8306_673448a61bb0.slice - libcontainer container kubepods-besteffort-pod7534bfa4_9af8_4640_8306_673448a61bb0.slice. 
Jan 14 01:32:18.245365 systemd[1]: Created slice kubepods-besteffort-pod9e0350f1_074c_4801_8433_6b63afe081c2.slice - libcontainer container kubepods-besteffort-pod9e0350f1_074c_4801_8433_6b63afe081c2.slice. Jan 14 01:32:18.265381 kubelet[2924]: I0114 01:32:18.265341 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4543e16-4927-4a0f-aab5-23e9d79b2579-config-volume\") pod \"coredns-668d6bf9bc-lkwn9\" (UID: \"e4543e16-4927-4a0f-aab5-23e9d79b2579\") " pod="kube-system/coredns-668d6bf9bc-lkwn9" Jan 14 01:32:18.265831 kubelet[2924]: I0114 01:32:18.265760 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdx6\" (UniqueName: \"kubernetes.io/projected/a4130f46-1c6e-474c-9a1c-5fd1820934c0-kube-api-access-5jdx6\") pod \"calico-apiserver-5f6c79b459-qjrqf\" (UID: \"a4130f46-1c6e-474c-9a1c-5fd1820934c0\") " pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" Jan 14 01:32:18.265831 kubelet[2924]: I0114 01:32:18.265789 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e0350f1-074c-4801-8433-6b63afe081c2-config\") pod \"goldmane-666569f655-gknfk\" (UID: \"9e0350f1-074c-4801-8433-6b63afe081c2\") " pod="calico-system/goldmane-666569f655-gknfk" Jan 14 01:32:18.266165 kubelet[2924]: I0114 01:32:18.266151 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd8ps\" (UniqueName: \"kubernetes.io/projected/9e0350f1-074c-4801-8433-6b63afe081c2-kube-api-access-pd8ps\") pod \"goldmane-666569f655-gknfk\" (UID: \"9e0350f1-074c-4801-8433-6b63afe081c2\") " pod="calico-system/goldmane-666569f655-gknfk" Jan 14 01:32:18.266392 kubelet[2924]: I0114 01:32:18.266358 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c366ef83-2b27-4825-8b26-f67f058bea78-config-volume\") pod \"coredns-668d6bf9bc-kbxbb\" (UID: \"c366ef83-2b27-4825-8b26-f67f058bea78\") " pod="kube-system/coredns-668d6bf9bc-kbxbb" Jan 14 01:32:18.266926 kubelet[2924]: I0114 01:32:18.266378 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7742\" (UniqueName: \"kubernetes.io/projected/7534bfa4-9af8-4640-8306-673448a61bb0-kube-api-access-l7742\") pod \"calico-kube-controllers-84b7d46b9c-rw586\" (UID: \"7534bfa4-9af8-4640-8306-673448a61bb0\") " pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" Jan 14 01:32:18.266926 kubelet[2924]: I0114 01:32:18.266762 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdmck\" (UniqueName: \"kubernetes.io/projected/8e676668-dd2e-4394-a5c6-387e536f2fb9-kube-api-access-qdmck\") pod \"whisker-847d9f6f58-xxzwb\" (UID: \"8e676668-dd2e-4394-a5c6-387e536f2fb9\") " pod="calico-system/whisker-847d9f6f58-xxzwb" Jan 14 01:32:18.266926 kubelet[2924]: I0114 01:32:18.266782 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/9e0350f1-074c-4801-8433-6b63afe081c2-goldmane-key-pair\") pod \"goldmane-666569f655-gknfk\" (UID: \"9e0350f1-074c-4801-8433-6b63afe081c2\") " pod="calico-system/goldmane-666569f655-gknfk" Jan 14 01:32:18.266926 kubelet[2924]: I0114 01:32:18.266798 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmnkd\" (UniqueName: \"kubernetes.io/projected/327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2-kube-api-access-fmnkd\") pod \"calico-apiserver-5f6c79b459-7rtbs\" (UID: \"327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2\") " pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" Jan 14 01:32:18.266926 kubelet[2924]: I0114 
01:32:18.266815 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8e676668-dd2e-4394-a5c6-387e536f2fb9-whisker-backend-key-pair\") pod \"whisker-847d9f6f58-xxzwb\" (UID: \"8e676668-dd2e-4394-a5c6-387e536f2fb9\") " pod="calico-system/whisker-847d9f6f58-xxzwb" Jan 14 01:32:18.267073 kubelet[2924]: I0114 01:32:18.266830 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a4130f46-1c6e-474c-9a1c-5fd1820934c0-calico-apiserver-certs\") pod \"calico-apiserver-5f6c79b459-qjrqf\" (UID: \"a4130f46-1c6e-474c-9a1c-5fd1820934c0\") " pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" Jan 14 01:32:18.267288 kubelet[2924]: I0114 01:32:18.267275 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6llfh\" (UniqueName: \"kubernetes.io/projected/c366ef83-2b27-4825-8b26-f67f058bea78-kube-api-access-6llfh\") pod \"coredns-668d6bf9bc-kbxbb\" (UID: \"c366ef83-2b27-4825-8b26-f67f058bea78\") " pod="kube-system/coredns-668d6bf9bc-kbxbb" Jan 14 01:32:18.267412 kubelet[2924]: I0114 01:32:18.267358 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7534bfa4-9af8-4640-8306-673448a61bb0-tigera-ca-bundle\") pod \"calico-kube-controllers-84b7d46b9c-rw586\" (UID: \"7534bfa4-9af8-4640-8306-673448a61bb0\") " pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" Jan 14 01:32:18.267560 kubelet[2924]: I0114 01:32:18.267471 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2-calico-apiserver-certs\") pod \"calico-apiserver-5f6c79b459-7rtbs\" (UID: 
\"327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2\") " pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" Jan 14 01:32:18.267634 kubelet[2924]: I0114 01:32:18.267624 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e676668-dd2e-4394-a5c6-387e536f2fb9-whisker-ca-bundle\") pod \"whisker-847d9f6f58-xxzwb\" (UID: \"8e676668-dd2e-4394-a5c6-387e536f2fb9\") " pod="calico-system/whisker-847d9f6f58-xxzwb" Jan 14 01:32:18.267724 kubelet[2924]: I0114 01:32:18.267681 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e0350f1-074c-4801-8433-6b63afe081c2-goldmane-ca-bundle\") pod \"goldmane-666569f655-gknfk\" (UID: \"9e0350f1-074c-4801-8433-6b63afe081c2\") " pod="calico-system/goldmane-666569f655-gknfk" Jan 14 01:32:18.267928 kubelet[2924]: I0114 01:32:18.267826 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhgxg\" (UniqueName: \"kubernetes.io/projected/e4543e16-4927-4a0f-aab5-23e9d79b2579-kube-api-access-mhgxg\") pod \"coredns-668d6bf9bc-lkwn9\" (UID: \"e4543e16-4927-4a0f-aab5-23e9d79b2579\") " pod="kube-system/coredns-668d6bf9bc-lkwn9" Jan 14 01:32:18.492404 containerd[1684]: time="2026-01-14T01:32:18.491874809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-847d9f6f58-xxzwb,Uid:8e676668-dd2e-4394-a5c6-387e536f2fb9,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:18.507688 containerd[1684]: time="2026-01-14T01:32:18.507549863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lkwn9,Uid:e4543e16-4927-4a0f-aab5-23e9d79b2579,Namespace:kube-system,Attempt:0,}" Jan 14 01:32:18.521597 containerd[1684]: time="2026-01-14T01:32:18.521559781Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5f6c79b459-7rtbs,Uid:327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:32:18.528450 containerd[1684]: time="2026-01-14T01:32:18.528409254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f6c79b459-qjrqf,Uid:a4130f46-1c6e-474c-9a1c-5fd1820934c0,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:32:18.534982 containerd[1684]: time="2026-01-14T01:32:18.534937180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kbxbb,Uid:c366ef83-2b27-4825-8b26-f67f058bea78,Namespace:kube-system,Attempt:0,}" Jan 14 01:32:18.543601 containerd[1684]: time="2026-01-14T01:32:18.543537481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84b7d46b9c-rw586,Uid:7534bfa4-9af8-4640-8306-673448a61bb0,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:18.825072 containerd[1684]: time="2026-01-14T01:32:18.824353102Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-gknfk,Uid:9e0350f1-074c-4801-8433-6b63afe081c2,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:19.754992 containerd[1684]: time="2026-01-14T01:32:19.754690972Z" level=error msg="Failed to destroy network for sandbox \"01b2d9b40c923705deeae50c5c35257bac79a51538c64647e69193fc5658d3c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.762179 containerd[1684]: time="2026-01-14T01:32:19.761978580Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f6c79b459-7rtbs,Uid:327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"01b2d9b40c923705deeae50c5c35257bac79a51538c64647e69193fc5658d3c7\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.762332 kubelet[2924]: E0114 01:32:19.762281 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01b2d9b40c923705deeae50c5c35257bac79a51538c64647e69193fc5658d3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.762597 kubelet[2924]: E0114 01:32:19.762351 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01b2d9b40c923705deeae50c5c35257bac79a51538c64647e69193fc5658d3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" Jan 14 01:32:19.762597 kubelet[2924]: E0114 01:32:19.762372 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01b2d9b40c923705deeae50c5c35257bac79a51538c64647e69193fc5658d3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" Jan 14 01:32:19.762597 kubelet[2924]: E0114 01:32:19.762411 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5f6c79b459-7rtbs_calico-apiserver(327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f6c79b459-7rtbs_calico-apiserver(327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2)\\\": 
rpc error: code = Unknown desc = failed to setup network for sandbox \\\"01b2d9b40c923705deeae50c5c35257bac79a51538c64647e69193fc5658d3c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:32:19.774308 systemd[1]: Created slice kubepods-besteffort-pode9f3a916_9cf2_4ab2_9a2c_762b6410a5d2.slice - libcontainer container kubepods-besteffort-pode9f3a916_9cf2_4ab2_9a2c_762b6410a5d2.slice. Jan 14 01:32:19.777175 containerd[1684]: time="2026-01-14T01:32:19.777143883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bfb5m,Uid:e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:19.790929 containerd[1684]: time="2026-01-14T01:32:19.790892579Z" level=error msg="Failed to destroy network for sandbox \"ca01a0e69e68a538d26e077bb97031c560f5168a5813fc9c09a55d09d4c2ad1f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.794232 containerd[1684]: time="2026-01-14T01:32:19.794185439Z" level=error msg="Failed to destroy network for sandbox \"9b941d323e75096ec82a62759495fcf356a60d38d41b588e89597191b90de4eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.794750 containerd[1684]: time="2026-01-14T01:32:19.794712789Z" level=error msg="Failed to destroy network for sandbox \"3fc63a505c4b7fc67d293cf8691e10def1a898b4b4985590f90db0d3edb4cf63\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.796074 containerd[1684]: time="2026-01-14T01:32:19.796018257Z" level=error msg="Failed to destroy network for sandbox \"a0391dc07595c62f628b35cf44960b126222a1810f455fd073d7c7842269b455\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.797088 containerd[1684]: time="2026-01-14T01:32:19.797063036Z" level=error msg="Failed to destroy network for sandbox \"ad6f926cc055985fc2a99c3c6a13b0def5477b9efc8140c51d25d763c7dd361c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.797244 containerd[1684]: time="2026-01-14T01:32:19.797176542Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-gknfk,Uid:9e0350f1-074c-4801-8433-6b63afe081c2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca01a0e69e68a538d26e077bb97031c560f5168a5813fc9c09a55d09d4c2ad1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.797856 kubelet[2924]: E0114 01:32:19.797512 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca01a0e69e68a538d26e077bb97031c560f5168a5813fc9c09a55d09d4c2ad1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.797856 kubelet[2924]: E0114 01:32:19.797764 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"ca01a0e69e68a538d26e077bb97031c560f5168a5813fc9c09a55d09d4c2ad1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-gknfk" Jan 14 01:32:19.797856 kubelet[2924]: E0114 01:32:19.797788 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca01a0e69e68a538d26e077bb97031c560f5168a5813fc9c09a55d09d4c2ad1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-gknfk" Jan 14 01:32:19.797974 kubelet[2924]: E0114 01:32:19.797825 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-gknfk_calico-system(9e0350f1-074c-4801-8433-6b63afe081c2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-gknfk_calico-system(9e0350f1-074c-4801-8433-6b63afe081c2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca01a0e69e68a538d26e077bb97031c560f5168a5813fc9c09a55d09d4c2ad1f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:32:19.802628 containerd[1684]: time="2026-01-14T01:32:19.801666933Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-847d9f6f58-xxzwb,Uid:8e676668-dd2e-4394-a5c6-387e536f2fb9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9b941d323e75096ec82a62759495fcf356a60d38d41b588e89597191b90de4eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.802758 kubelet[2924]: E0114 01:32:19.802543 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b941d323e75096ec82a62759495fcf356a60d38d41b588e89597191b90de4eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.802808 kubelet[2924]: E0114 01:32:19.802773 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b941d323e75096ec82a62759495fcf356a60d38d41b588e89597191b90de4eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-847d9f6f58-xxzwb" Jan 14 01:32:19.802808 kubelet[2924]: E0114 01:32:19.802792 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b941d323e75096ec82a62759495fcf356a60d38d41b588e89597191b90de4eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-847d9f6f58-xxzwb" Jan 14 01:32:19.802862 kubelet[2924]: E0114 01:32:19.802824 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-847d9f6f58-xxzwb_calico-system(8e676668-dd2e-4394-a5c6-387e536f2fb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"whisker-847d9f6f58-xxzwb_calico-system(8e676668-dd2e-4394-a5c6-387e536f2fb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b941d323e75096ec82a62759495fcf356a60d38d41b588e89597191b90de4eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-847d9f6f58-xxzwb" podUID="8e676668-dd2e-4394-a5c6-387e536f2fb9" Jan 14 01:32:19.807512 containerd[1684]: time="2026-01-14T01:32:19.807473982Z" level=error msg="Failed to destroy network for sandbox \"0b1de6202ef8c0e676941006177795574360b3eab676caf812db998c5fdc22f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.811329 containerd[1684]: time="2026-01-14T01:32:19.811296260Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lkwn9,Uid:e4543e16-4927-4a0f-aab5-23e9d79b2579,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fc63a505c4b7fc67d293cf8691e10def1a898b4b4985590f90db0d3edb4cf63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.812129 kubelet[2924]: E0114 01:32:19.811437 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fc63a505c4b7fc67d293cf8691e10def1a898b4b4985590f90db0d3edb4cf63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.812129 kubelet[2924]: E0114 01:32:19.811472 2924 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fc63a505c4b7fc67d293cf8691e10def1a898b4b4985590f90db0d3edb4cf63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-lkwn9" Jan 14 01:32:19.812129 kubelet[2924]: E0114 01:32:19.811490 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fc63a505c4b7fc67d293cf8691e10def1a898b4b4985590f90db0d3edb4cf63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-lkwn9" Jan 14 01:32:19.812240 kubelet[2924]: E0114 01:32:19.811520 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-lkwn9_kube-system(e4543e16-4927-4a0f-aab5-23e9d79b2579)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-lkwn9_kube-system(e4543e16-4927-4a0f-aab5-23e9d79b2579)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3fc63a505c4b7fc67d293cf8691e10def1a898b4b4985590f90db0d3edb4cf63\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-lkwn9" podUID="e4543e16-4927-4a0f-aab5-23e9d79b2579" Jan 14 01:32:19.813638 containerd[1684]: time="2026-01-14T01:32:19.812860467Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f6c79b459-qjrqf,Uid:a4130f46-1c6e-474c-9a1c-5fd1820934c0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"a0391dc07595c62f628b35cf44960b126222a1810f455fd073d7c7842269b455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.813723 kubelet[2924]: E0114 01:32:19.813105 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0391dc07595c62f628b35cf44960b126222a1810f455fd073d7c7842269b455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.813723 kubelet[2924]: E0114 01:32:19.813132 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0391dc07595c62f628b35cf44960b126222a1810f455fd073d7c7842269b455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" Jan 14 01:32:19.813723 kubelet[2924]: E0114 01:32:19.813149 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0391dc07595c62f628b35cf44960b126222a1810f455fd073d7c7842269b455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" Jan 14 01:32:19.813805 kubelet[2924]: E0114 01:32:19.813176 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-5f6c79b459-qjrqf_calico-apiserver(a4130f46-1c6e-474c-9a1c-5fd1820934c0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f6c79b459-qjrqf_calico-apiserver(a4130f46-1c6e-474c-9a1c-5fd1820934c0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a0391dc07595c62f628b35cf44960b126222a1810f455fd073d7c7842269b455\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:32:19.815762 containerd[1684]: time="2026-01-14T01:32:19.815732462Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84b7d46b9c-rw586,Uid:7534bfa4-9af8-4640-8306-673448a61bb0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad6f926cc055985fc2a99c3c6a13b0def5477b9efc8140c51d25d763c7dd361c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.815927 kubelet[2924]: E0114 01:32:19.815884 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad6f926cc055985fc2a99c3c6a13b0def5477b9efc8140c51d25d763c7dd361c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.815927 kubelet[2924]: E0114 01:32:19.815914 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad6f926cc055985fc2a99c3c6a13b0def5477b9efc8140c51d25d763c7dd361c\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" Jan 14 01:32:19.816893 kubelet[2924]: E0114 01:32:19.815930 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad6f926cc055985fc2a99c3c6a13b0def5477b9efc8140c51d25d763c7dd361c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" Jan 14 01:32:19.816893 kubelet[2924]: E0114 01:32:19.815958 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-84b7d46b9c-rw586_calico-system(7534bfa4-9af8-4640-8306-673448a61bb0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-84b7d46b9c-rw586_calico-system(7534bfa4-9af8-4640-8306-673448a61bb0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad6f926cc055985fc2a99c3c6a13b0def5477b9efc8140c51d25d763c7dd361c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:32:19.817147 containerd[1684]: time="2026-01-14T01:32:19.817112489Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kbxbb,Uid:c366ef83-2b27-4825-8b26-f67f058bea78,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b1de6202ef8c0e676941006177795574360b3eab676caf812db998c5fdc22f4\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.819819 kubelet[2924]: E0114 01:32:19.819018 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b1de6202ef8c0e676941006177795574360b3eab676caf812db998c5fdc22f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.820036 kubelet[2924]: E0114 01:32:19.819993 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b1de6202ef8c0e676941006177795574360b3eab676caf812db998c5fdc22f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-kbxbb" Jan 14 01:32:19.820036 kubelet[2924]: E0114 01:32:19.820017 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b1de6202ef8c0e676941006177795574360b3eab676caf812db998c5fdc22f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-kbxbb" Jan 14 01:32:19.820151 kubelet[2924]: E0114 01:32:19.820131 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-kbxbb_kube-system(c366ef83-2b27-4825-8b26-f67f058bea78)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-kbxbb_kube-system(c366ef83-2b27-4825-8b26-f67f058bea78)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"0b1de6202ef8c0e676941006177795574360b3eab676caf812db998c5fdc22f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-kbxbb" podUID="c366ef83-2b27-4825-8b26-f67f058bea78" Jan 14 01:32:19.841912 containerd[1684]: time="2026-01-14T01:32:19.841876430Z" level=error msg="Failed to destroy network for sandbox \"0b59f5b1b889d1962b09a933c96daf65cff266881b8c6fc53c6191494fcdd504\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.845488 containerd[1684]: time="2026-01-14T01:32:19.845453907Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bfb5m,Uid:e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b59f5b1b889d1962b09a933c96daf65cff266881b8c6fc53c6191494fcdd504\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.845809 kubelet[2924]: E0114 01:32:19.845776 2924 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b59f5b1b889d1962b09a933c96daf65cff266881b8c6fc53c6191494fcdd504\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:32:19.845881 kubelet[2924]: E0114 01:32:19.845826 2924 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"0b59f5b1b889d1962b09a933c96daf65cff266881b8c6fc53c6191494fcdd504\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bfb5m" Jan 14 01:32:19.845881 kubelet[2924]: E0114 01:32:19.845859 2924 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b59f5b1b889d1962b09a933c96daf65cff266881b8c6fc53c6191494fcdd504\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bfb5m" Jan 14 01:32:19.845966 kubelet[2924]: E0114 01:32:19.845902 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0b59f5b1b889d1962b09a933c96daf65cff266881b8c6fc53c6191494fcdd504\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:19.864909 containerd[1684]: time="2026-01-14T01:32:19.864876321Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 14 01:32:20.092084 systemd[1]: run-netns-cni\x2d9f9fbe5b\x2d9f69\x2d164e\x2d4e2a\x2dea3aecb92fe1.mount: Deactivated successfully. Jan 14 01:32:20.092177 systemd[1]: run-netns-cni\x2d511a8a58\x2daa76\x2d930d\x2d176d\x2d0b270d8ff83e.mount: Deactivated successfully. 
Jan 14 01:32:20.092248 systemd[1]: run-netns-cni\x2d7d05814f\x2d9ab0\x2df0fc\x2d0595\x2df3cddd055aac.mount: Deactivated successfully. Jan 14 01:32:20.092295 systemd[1]: run-netns-cni\x2d50c17a85\x2d4ff6\x2d9448\x2d74bc\x2dd5cc8cb2138b.mount: Deactivated successfully. Jan 14 01:32:27.673256 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1805111620.mount: Deactivated successfully. Jan 14 01:32:27.699260 containerd[1684]: time="2026-01-14T01:32:27.699204324Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:27.701378 containerd[1684]: time="2026-01-14T01:32:27.701341548Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 14 01:32:27.702770 containerd[1684]: time="2026-01-14T01:32:27.702713277Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:27.705835 containerd[1684]: time="2026-01-14T01:32:27.705799930Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:32:27.706544 containerd[1684]: time="2026-01-14T01:32:27.706519002Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.84159896s" Jan 14 01:32:27.706575 containerd[1684]: time="2026-01-14T01:32:27.706546907Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference 
\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 14 01:32:27.719298 containerd[1684]: time="2026-01-14T01:32:27.719268236Z" level=info msg="CreateContainer within sandbox \"c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 14 01:32:27.740180 containerd[1684]: time="2026-01-14T01:32:27.736638226Z" level=info msg="Container 2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:32:27.752322 containerd[1684]: time="2026-01-14T01:32:27.752287443Z" level=info msg="CreateContainer within sandbox \"c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63\"" Jan 14 01:32:27.753186 containerd[1684]: time="2026-01-14T01:32:27.753155781Z" level=info msg="StartContainer for \"2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63\"" Jan 14 01:32:27.754695 containerd[1684]: time="2026-01-14T01:32:27.754674048Z" level=info msg="connecting to shim 2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63" address="unix:///run/containerd/s/9afe881497127ae7ad0801ea13f2921bc7ec8dcbcf9a38e2816f34e6e826a3c8" protocol=ttrpc version=3 Jan 14 01:32:27.817067 systemd[1]: Started cri-containerd-2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63.scope - libcontainer container 2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63. 
Jan 14 01:32:27.883000 audit: BPF prog-id=172 op=LOAD Jan 14 01:32:27.885199 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 14 01:32:27.885268 kernel: audit: type=1334 audit(1768354347.883:570): prog-id=172 op=LOAD Jan 14 01:32:27.883000 audit[3910]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3428 pid=3910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:27.890231 kernel: audit: type=1300 audit(1768354347.883:570): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3428 pid=3910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:27.883000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235343962643962303963633165316366316139313862306136643066 Jan 14 01:32:27.896260 kernel: audit: type=1327 audit(1768354347.883:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235343962643962303963633165316366316139313862306136643066 Jan 14 01:32:27.883000 audit: BPF prog-id=173 op=LOAD Jan 14 01:32:27.883000 audit[3910]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3428 pid=3910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:27.903945 kernel: audit: type=1334 
audit(1768354347.883:571): prog-id=173 op=LOAD Jan 14 01:32:27.903994 kernel: audit: type=1300 audit(1768354347.883:571): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3428 pid=3910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:27.883000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235343962643962303963633165316366316139313862306136643066 Jan 14 01:32:27.909172 kernel: audit: type=1327 audit(1768354347.883:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235343962643962303963633165316366316139313862306136643066 Jan 14 01:32:27.883000 audit: BPF prog-id=173 op=UNLOAD Jan 14 01:32:27.883000 audit[3910]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:27.915592 kernel: audit: type=1334 audit(1768354347.883:572): prog-id=173 op=UNLOAD Jan 14 01:32:27.915635 kernel: audit: type=1300 audit(1768354347.883:572): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:27.883000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235343962643962303963633165316366316139313862306136643066 Jan 14 01:32:27.919911 kernel: audit: type=1327 audit(1768354347.883:572): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235343962643962303963633165316366316139313862306136643066 Jan 14 01:32:27.883000 audit: BPF prog-id=172 op=UNLOAD Jan 14 01:32:27.922936 kernel: audit: type=1334 audit(1768354347.883:573): prog-id=172 op=UNLOAD Jan 14 01:32:27.883000 audit[3910]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:27.883000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235343962643962303963633165316366316139313862306136643066 Jan 14 01:32:27.883000 audit: BPF prog-id=174 op=LOAD Jan 14 01:32:27.883000 audit[3910]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3428 pid=3910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:27.883000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235343962643962303963633165316366316139313862306136643066 Jan 14 01:32:27.931507 containerd[1684]: time="2026-01-14T01:32:27.931478227Z" level=info msg="StartContainer for \"2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63\" returns successfully" Jan 14 01:32:28.021410 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 14 01:32:28.021502 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 14 01:32:28.231920 kubelet[2924]: I0114 01:32:28.231741 2924 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdmck\" (UniqueName: \"kubernetes.io/projected/8e676668-dd2e-4394-a5c6-387e536f2fb9-kube-api-access-qdmck\") pod \"8e676668-dd2e-4394-a5c6-387e536f2fb9\" (UID: \"8e676668-dd2e-4394-a5c6-387e536f2fb9\") " Jan 14 01:32:28.231920 kubelet[2924]: I0114 01:32:28.231796 2924 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e676668-dd2e-4394-a5c6-387e536f2fb9-whisker-ca-bundle\") pod \"8e676668-dd2e-4394-a5c6-387e536f2fb9\" (UID: \"8e676668-dd2e-4394-a5c6-387e536f2fb9\") " Jan 14 01:32:28.231920 kubelet[2924]: I0114 01:32:28.231815 2924 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8e676668-dd2e-4394-a5c6-387e536f2fb9-whisker-backend-key-pair\") pod \"8e676668-dd2e-4394-a5c6-387e536f2fb9\" (UID: \"8e676668-dd2e-4394-a5c6-387e536f2fb9\") " Jan 14 01:32:28.234249 kubelet[2924]: I0114 01:32:28.234216 2924 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e676668-dd2e-4394-a5c6-387e536f2fb9-whisker-ca-bundle" 
(OuterVolumeSpecName: "whisker-ca-bundle") pod "8e676668-dd2e-4394-a5c6-387e536f2fb9" (UID: "8e676668-dd2e-4394-a5c6-387e536f2fb9"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 14 01:32:28.238500 kubelet[2924]: I0114 01:32:28.238305 2924 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e676668-dd2e-4394-a5c6-387e536f2fb9-kube-api-access-qdmck" (OuterVolumeSpecName: "kube-api-access-qdmck") pod "8e676668-dd2e-4394-a5c6-387e536f2fb9" (UID: "8e676668-dd2e-4394-a5c6-387e536f2fb9"). InnerVolumeSpecName "kube-api-access-qdmck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 14 01:32:28.239092 kubelet[2924]: I0114 01:32:28.239066 2924 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e676668-dd2e-4394-a5c6-387e536f2fb9-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "8e676668-dd2e-4394-a5c6-387e536f2fb9" (UID: "8e676668-dd2e-4394-a5c6-387e536f2fb9"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 14 01:32:28.335253 kubelet[2924]: I0114 01:32:28.334669 2924 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8e676668-dd2e-4394-a5c6-387e536f2fb9-whisker-backend-key-pair\") on node \"ci-4578-0-0-p-557efd55ff\" DevicePath \"\"" Jan 14 01:32:28.335253 kubelet[2924]: I0114 01:32:28.334696 2924 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e676668-dd2e-4394-a5c6-387e536f2fb9-whisker-ca-bundle\") on node \"ci-4578-0-0-p-557efd55ff\" DevicePath \"\"" Jan 14 01:32:28.335253 kubelet[2924]: I0114 01:32:28.334705 2924 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdmck\" (UniqueName: \"kubernetes.io/projected/8e676668-dd2e-4394-a5c6-387e536f2fb9-kube-api-access-qdmck\") on node \"ci-4578-0-0-p-557efd55ff\" DevicePath \"\"" Jan 14 01:32:28.671832 systemd[1]: var-lib-kubelet-pods-8e676668\x2ddd2e\x2d4394\x2da5c6\x2d387e536f2fb9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqdmck.mount: Deactivated successfully. Jan 14 01:32:28.671973 systemd[1]: var-lib-kubelet-pods-8e676668\x2ddd2e\x2d4394\x2da5c6\x2d387e536f2fb9-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 14 01:32:28.779404 systemd[1]: Removed slice kubepods-besteffort-pod8e676668_dd2e_4394_a5c6_387e536f2fb9.slice - libcontainer container kubepods-besteffort-pod8e676668_dd2e_4394_a5c6_387e536f2fb9.slice. 
Jan 14 01:32:28.927515 kubelet[2924]: I0114 01:32:28.927029 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-46xxr" podStartSLOduration=1.738927779 podStartE2EDuration="22.927008477s" podCreationTimestamp="2026-01-14 01:32:06 +0000 UTC" firstStartedPulling="2026-01-14 01:32:06.519059704 +0000 UTC m=+19.851288936" lastFinishedPulling="2026-01-14 01:32:27.707140403 +0000 UTC m=+41.039369634" observedRunningTime="2026-01-14 01:32:28.913082376 +0000 UTC m=+42.245311626" watchObservedRunningTime="2026-01-14 01:32:28.927008477 +0000 UTC m=+42.259237750" Jan 14 01:32:28.964018 systemd[1]: Created slice kubepods-besteffort-pod5694b728_96e4_405e_ad55_bbb10255a07e.slice - libcontainer container kubepods-besteffort-pod5694b728_96e4_405e_ad55_bbb10255a07e.slice. Jan 14 01:32:29.039312 kubelet[2924]: I0114 01:32:29.039268 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5694b728-96e4-405e-ad55-bbb10255a07e-whisker-backend-key-pair\") pod \"whisker-fcdf56664-kjljc\" (UID: \"5694b728-96e4-405e-ad55-bbb10255a07e\") " pod="calico-system/whisker-fcdf56664-kjljc" Jan 14 01:32:29.039312 kubelet[2924]: I0114 01:32:29.039306 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5694b728-96e4-405e-ad55-bbb10255a07e-whisker-ca-bundle\") pod \"whisker-fcdf56664-kjljc\" (UID: \"5694b728-96e4-405e-ad55-bbb10255a07e\") " pod="calico-system/whisker-fcdf56664-kjljc" Jan 14 01:32:29.039460 kubelet[2924]: I0114 01:32:29.039327 2924 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqs2g\" (UniqueName: \"kubernetes.io/projected/5694b728-96e4-405e-ad55-bbb10255a07e-kube-api-access-fqs2g\") pod \"whisker-fcdf56664-kjljc\" (UID: 
\"5694b728-96e4-405e-ad55-bbb10255a07e\") " pod="calico-system/whisker-fcdf56664-kjljc" Jan 14 01:32:29.268997 containerd[1684]: time="2026-01-14T01:32:29.268934673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fcdf56664-kjljc,Uid:5694b728-96e4-405e-ad55-bbb10255a07e,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:29.889966 systemd-networkd[1567]: cali95f41f79b8e: Link UP Jan 14 01:32:29.890212 systemd-networkd[1567]: cali95f41f79b8e: Gained carrier Jan 14 01:32:29.917212 containerd[1684]: 2026-01-14 01:32:29.318 [INFO][3998] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:32:29.917212 containerd[1684]: 2026-01-14 01:32:29.461 [INFO][3998] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0 whisker-fcdf56664- calico-system 5694b728-96e4-405e-ad55-bbb10255a07e 866 0 2026-01-14 01:32:28 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:fcdf56664 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4578-0-0-p-557efd55ff whisker-fcdf56664-kjljc eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali95f41f79b8e [] [] }} ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Namespace="calico-system" Pod="whisker-fcdf56664-kjljc" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-" Jan 14 01:32:29.917212 containerd[1684]: 2026-01-14 01:32:29.461 [INFO][3998] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Namespace="calico-system" Pod="whisker-fcdf56664-kjljc" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" Jan 14 01:32:29.917212 containerd[1684]: 2026-01-14 01:32:29.509 [INFO][4095] 
ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" HandleID="k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Workload="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.510 [INFO][4095] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" HandleID="k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Workload="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00035c0e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-557efd55ff", "pod":"whisker-fcdf56664-kjljc", "timestamp":"2026-01-14 01:32:29.509206465 +0000 UTC"}, Hostname:"ci-4578-0-0-p-557efd55ff", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.510 [INFO][4095] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.511 [INFO][4095] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.513 [INFO][4095] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-557efd55ff' Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.523 [INFO][4095] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.529 [INFO][4095] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.535 [INFO][4095] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.536 [INFO][4095] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917438 containerd[1684]: 2026-01-14 01:32:29.538 [INFO][4095] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917634 containerd[1684]: 2026-01-14 01:32:29.538 [INFO][4095] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917634 containerd[1684]: 2026-01-14 01:32:29.539 [INFO][4095] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5 Jan 14 01:32:29.917634 containerd[1684]: 2026-01-14 01:32:29.547 [INFO][4095] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917634 containerd[1684]: 2026-01-14 01:32:29.551 [INFO][4095] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.129/26] block=192.168.118.128/26 handle="k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917634 containerd[1684]: 2026-01-14 01:32:29.551 [INFO][4095] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.129/26] handle="k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:29.917634 containerd[1684]: 2026-01-14 01:32:29.551 [INFO][4095] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:32:29.917634 containerd[1684]: 2026-01-14 01:32:29.551 [INFO][4095] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.129/26] IPv6=[] ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" HandleID="k8s-pod-network.6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Workload="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" Jan 14 01:32:29.917766 containerd[1684]: 2026-01-14 01:32:29.555 [INFO][3998] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Namespace="calico-system" Pod="whisker-fcdf56664-kjljc" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0", GenerateName:"whisker-fcdf56664-", Namespace:"calico-system", SelfLink:"", UID:"5694b728-96e4-405e-ad55-bbb10255a07e", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fcdf56664", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"", Pod:"whisker-fcdf56664-kjljc", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.118.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali95f41f79b8e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:29.917766 containerd[1684]: 2026-01-14 01:32:29.555 [INFO][3998] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.129/32] ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Namespace="calico-system" Pod="whisker-fcdf56664-kjljc" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" Jan 14 01:32:29.917835 containerd[1684]: 2026-01-14 01:32:29.555 [INFO][3998] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali95f41f79b8e ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Namespace="calico-system" Pod="whisker-fcdf56664-kjljc" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" Jan 14 01:32:29.917835 containerd[1684]: 2026-01-14 01:32:29.890 [INFO][3998] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Namespace="calico-system" Pod="whisker-fcdf56664-kjljc" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" Jan 14 01:32:29.917889 containerd[1684]: 2026-01-14 01:32:29.891 [INFO][3998] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Namespace="calico-system" Pod="whisker-fcdf56664-kjljc" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0", GenerateName:"whisker-fcdf56664-", Namespace:"calico-system", SelfLink:"", UID:"5694b728-96e4-405e-ad55-bbb10255a07e", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fcdf56664", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5", Pod:"whisker-fcdf56664-kjljc", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.118.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali95f41f79b8e", MAC:"7a:d7:5d:53:22:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:29.917941 containerd[1684]: 2026-01-14 01:32:29.904 [INFO][3998] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" Namespace="calico-system" Pod="whisker-fcdf56664-kjljc" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-whisker--fcdf56664--kjljc-eth0" Jan 14 01:32:30.012386 containerd[1684]: time="2026-01-14T01:32:30.011941066Z" level=info msg="connecting to shim 6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5" address="unix:///run/containerd/s/709176a4020ae31e505095bed5711bd0a014a244d45249dfc36050fef05696fa" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:30.037368 systemd[1]: Started cri-containerd-6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5.scope - libcontainer container 6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5. Jan 14 01:32:30.045000 audit: BPF prog-id=175 op=LOAD Jan 14 01:32:30.046000 audit: BPF prog-id=176 op=LOAD Jan 14 01:32:30.046000 audit[4157]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4146 pid=4157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.046000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665613366643032393465623165663134353935326231386466313235 Jan 14 01:32:30.046000 audit: BPF prog-id=176 op=UNLOAD Jan 14 01:32:30.046000 audit[4157]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4146 pid=4157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.046000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665613366643032393465623165663134353935326231386466313235 Jan 14 01:32:30.046000 audit: BPF prog-id=177 op=LOAD Jan 14 01:32:30.046000 audit[4157]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4146 pid=4157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.046000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665613366643032393465623165663134353935326231386466313235 Jan 14 01:32:30.046000 audit: BPF prog-id=178 op=LOAD Jan 14 01:32:30.046000 audit[4157]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4146 pid=4157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.046000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665613366643032393465623165663134353935326231386466313235 Jan 14 01:32:30.046000 audit: BPF prog-id=178 op=UNLOAD Jan 14 01:32:30.046000 audit[4157]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4146 pid=4157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:32:30.046000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665613366643032393465623165663134353935326231386466313235 Jan 14 01:32:30.046000 audit: BPF prog-id=177 op=UNLOAD Jan 14 01:32:30.046000 audit[4157]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4146 pid=4157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.046000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665613366643032393465623165663134353935326231386466313235 Jan 14 01:32:30.046000 audit: BPF prog-id=179 op=LOAD Jan 14 01:32:30.046000 audit[4157]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4146 pid=4157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.046000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665613366643032393465623165663134353935326231386466313235 Jan 14 01:32:30.082994 containerd[1684]: time="2026-01-14T01:32:30.082945829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fcdf56664-kjljc,Uid:5694b728-96e4-405e-ad55-bbb10255a07e,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5\"" Jan 14 01:32:30.084601 containerd[1684]: time="2026-01-14T01:32:30.084572199Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:32:30.421866 containerd[1684]: time="2026-01-14T01:32:30.420881588Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:30.422746 containerd[1684]: time="2026-01-14T01:32:30.422709425Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:32:30.422818 containerd[1684]: time="2026-01-14T01:32:30.422800303Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:30.423292 kubelet[2924]: E0114 01:32:30.423003 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:32:30.423292 kubelet[2924]: E0114 01:32:30.423056 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:32:30.434672 kubelet[2924]: E0114 01:32:30.434605 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:a26f849dc23e45ddb5523da6bae47563,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:30.437288 containerd[1684]: time="2026-01-14T01:32:30.437114557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:32:30.745807 containerd[1684]: 
time="2026-01-14T01:32:30.745717145Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:30.747860 containerd[1684]: time="2026-01-14T01:32:30.747805177Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:32:30.748045 containerd[1684]: time="2026-01-14T01:32:30.747892186Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:30.748140 kubelet[2924]: E0114 01:32:30.748110 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:32:30.748333 kubelet[2924]: E0114 01:32:30.748187 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:32:30.748555 kubelet[2924]: E0114 01:32:30.748408 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:30.749694 kubelet[2924]: E0114 01:32:30.749661 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:32:30.769706 containerd[1684]: time="2026-01-14T01:32:30.769377952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f6c79b459-qjrqf,Uid:a4130f46-1c6e-474c-9a1c-5fd1820934c0,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:32:30.772876 kubelet[2924]: I0114 01:32:30.771421 2924 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e676668-dd2e-4394-a5c6-387e536f2fb9" path="/var/lib/kubelet/pods/8e676668-dd2e-4394-a5c6-387e536f2fb9/volumes" Jan 14 01:32:30.878125 systemd-networkd[1567]: cali446cfd718fe: Link UP Jan 14 01:32:30.878665 systemd-networkd[1567]: cali446cfd718fe: Gained carrier Jan 14 01:32:30.890233 containerd[1684]: 2026-01-14 01:32:30.799 [INFO][4204] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:32:30.890233 containerd[1684]: 2026-01-14 01:32:30.810 [INFO][4204] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0 calico-apiserver-5f6c79b459- 
calico-apiserver a4130f46-1c6e-474c-9a1c-5fd1820934c0 799 0 2026-01-14 01:32:01 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f6c79b459 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4578-0-0-p-557efd55ff calico-apiserver-5f6c79b459-qjrqf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali446cfd718fe [] [] }} ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-qjrqf" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-" Jan 14 01:32:30.890233 containerd[1684]: 2026-01-14 01:32:30.810 [INFO][4204] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-qjrqf" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" Jan 14 01:32:30.890233 containerd[1684]: 2026-01-14 01:32:30.842 [INFO][4216] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" HandleID="k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.843 [INFO][4216] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" HandleID="k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc00024f020), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4578-0-0-p-557efd55ff", "pod":"calico-apiserver-5f6c79b459-qjrqf", "timestamp":"2026-01-14 01:32:30.842986729 +0000 UTC"}, Hostname:"ci-4578-0-0-p-557efd55ff", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.843 [INFO][4216] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.843 [INFO][4216] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.843 [INFO][4216] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-557efd55ff' Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.849 [INFO][4216] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.854 [INFO][4216] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.859 [INFO][4216] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.861 [INFO][4216] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890424 containerd[1684]: 2026-01-14 01:32:30.863 [INFO][4216] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890611 containerd[1684]: 2026-01-14 01:32:30.863 [INFO][4216] ipam/ipam.go 1219: Attempting to assign 1 addresses 
from block block=192.168.118.128/26 handle="k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890611 containerd[1684]: 2026-01-14 01:32:30.864 [INFO][4216] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d Jan 14 01:32:30.890611 containerd[1684]: 2026-01-14 01:32:30.868 [INFO][4216] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890611 containerd[1684]: 2026-01-14 01:32:30.873 [INFO][4216] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.118.130/26] block=192.168.118.128/26 handle="k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890611 containerd[1684]: 2026-01-14 01:32:30.873 [INFO][4216] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.130/26] handle="k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:30.890611 containerd[1684]: 2026-01-14 01:32:30.873 [INFO][4216] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:32:30.890611 containerd[1684]: 2026-01-14 01:32:30.873 [INFO][4216] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.130/26] IPv6=[] ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" HandleID="k8s-pod-network.ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" Jan 14 01:32:30.890781 containerd[1684]: 2026-01-14 01:32:30.875 [INFO][4204] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-qjrqf" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0", GenerateName:"calico-apiserver-5f6c79b459-", Namespace:"calico-apiserver", SelfLink:"", UID:"a4130f46-1c6e-474c-9a1c-5fd1820934c0", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f6c79b459", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"", Pod:"calico-apiserver-5f6c79b459-qjrqf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.118.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali446cfd718fe", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:30.890836 containerd[1684]: 2026-01-14 01:32:30.875 [INFO][4204] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.130/32] ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-qjrqf" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" Jan 14 01:32:30.890836 containerd[1684]: 2026-01-14 01:32:30.875 [INFO][4204] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali446cfd718fe ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-qjrqf" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" Jan 14 01:32:30.890836 containerd[1684]: 2026-01-14 01:32:30.879 [INFO][4204] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-qjrqf" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" Jan 14 01:32:30.890918 containerd[1684]: 2026-01-14 01:32:30.879 [INFO][4204] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-qjrqf" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0", GenerateName:"calico-apiserver-5f6c79b459-", Namespace:"calico-apiserver", SelfLink:"", UID:"a4130f46-1c6e-474c-9a1c-5fd1820934c0", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f6c79b459", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d", Pod:"calico-apiserver-5f6c79b459-qjrqf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.118.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali446cfd718fe", MAC:"9a:68:8c:cc:1e:38", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:30.890966 containerd[1684]: 2026-01-14 01:32:30.888 [INFO][4204] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-qjrqf" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--qjrqf-eth0" Jan 14 01:32:30.906556 kubelet[2924]: E0114 01:32:30.906090 2924 pod_workers.go:1301] 
"Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:32:30.929237 containerd[1684]: time="2026-01-14T01:32:30.929179239Z" level=info msg="connecting to shim ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d" address="unix:///run/containerd/s/6053363489de56902031a08f9c938d64dfeae8eedf58889de5346bf7972c7abe" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:30.939000 audit[4249]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=4249 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:30.939000 audit[4249]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc0249e230 a2=0 a3=7ffc0249e21c items=0 ppid=3028 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.939000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:30.943000 audit[4249]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=4249 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:30.943000 audit[4249]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc0249e230 a2=0 a3=0 items=0 ppid=3028 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.943000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:30.959041 systemd[1]: Started cri-containerd-ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d.scope - libcontainer container ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d. Jan 14 01:32:30.966000 audit: BPF prog-id=180 op=LOAD Jan 14 01:32:30.966000 audit: BPF prog-id=181 op=LOAD Jan 14 01:32:30.966000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4235 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563353635636161313466616633313433656265613562383564383264 Jan 14 01:32:30.966000 audit: BPF prog-id=181 op=UNLOAD Jan 14 01:32:30.966000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4235 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.966000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563353635636161313466616633313433656265613562383564383264 Jan 14 01:32:30.966000 audit: BPF prog-id=182 op=LOAD Jan 14 01:32:30.966000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4235 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563353635636161313466616633313433656265613562383564383264 Jan 14 01:32:30.966000 audit: BPF prog-id=183 op=LOAD Jan 14 01:32:30.966000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4235 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563353635636161313466616633313433656265613562383564383264 Jan 14 01:32:30.966000 audit: BPF prog-id=183 op=UNLOAD Jan 14 01:32:30.966000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4235 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:32:30.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563353635636161313466616633313433656265613562383564383264 Jan 14 01:32:30.966000 audit: BPF prog-id=182 op=UNLOAD Jan 14 01:32:30.966000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4235 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563353635636161313466616633313433656265613562383564383264 Jan 14 01:32:30.966000 audit: BPF prog-id=184 op=LOAD Jan 14 01:32:30.966000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4235 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:30.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563353635636161313466616633313433656265613562383564383264 Jan 14 01:32:31.031454 containerd[1684]: time="2026-01-14T01:32:31.031363026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f6c79b459-qjrqf,Uid:a4130f46-1c6e-474c-9a1c-5fd1820934c0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id 
\"ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d\"" Jan 14 01:32:31.032676 containerd[1684]: time="2026-01-14T01:32:31.032652991Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:32:31.374202 containerd[1684]: time="2026-01-14T01:32:31.374069718Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:31.376043 containerd[1684]: time="2026-01-14T01:32:31.375933659Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:32:31.376043 containerd[1684]: time="2026-01-14T01:32:31.375993656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:31.376493 kubelet[2924]: E0114 01:32:31.376406 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:32:31.376574 kubelet[2924]: E0114 01:32:31.376520 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:32:31.376966 kubelet[2924]: E0114 01:32:31.376820 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jdx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-qjrqf_calico-apiserver(a4130f46-1c6e-474c-9a1c-5fd1820934c0): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:31.378367 kubelet[2924]: E0114 01:32:31.378325 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:32:31.716092 systemd-networkd[1567]: cali95f41f79b8e: Gained IPv6LL Jan 14 01:32:31.769871 containerd[1684]: time="2026-01-14T01:32:31.769375531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f6c79b459-7rtbs,Uid:327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:32:31.899912 systemd-networkd[1567]: cali5e6df3eeac9: Link UP Jan 14 01:32:31.900414 systemd-networkd[1567]: cali5e6df3eeac9: Gained carrier Jan 14 01:32:31.911558 kubelet[2924]: E0114 01:32:31.911524 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:32:31.912433 kubelet[2924]: E0114 01:32:31.912376 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:32:31.924424 containerd[1684]: 2026-01-14 01:32:31.816 [INFO][4290] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:32:31.924424 containerd[1684]: 2026-01-14 01:32:31.830 [INFO][4290] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0 calico-apiserver-5f6c79b459- calico-apiserver 327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2 796 0 2026-01-14 01:32:01 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f6c79b459 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4578-0-0-p-557efd55ff calico-apiserver-5f6c79b459-7rtbs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5e6df3eeac9 [] [] }} ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-7rtbs" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-" Jan 14 01:32:31.924424 containerd[1684]: 2026-01-14 01:32:31.830 [INFO][4290] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-7rtbs" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" Jan 14 01:32:31.924424 containerd[1684]: 2026-01-14 01:32:31.864 [INFO][4307] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" HandleID="k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.865 [INFO][4307] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" HandleID="k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4578-0-0-p-557efd55ff", "pod":"calico-apiserver-5f6c79b459-7rtbs", "timestamp":"2026-01-14 01:32:31.864895812 +0000 UTC"}, Hostname:"ci-4578-0-0-p-557efd55ff", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.865 [INFO][4307] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.865 [INFO][4307] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.865 [INFO][4307] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-557efd55ff' Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.871 [INFO][4307] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.875 [INFO][4307] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.879 [INFO][4307] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.881 [INFO][4307] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.924615 containerd[1684]: 2026-01-14 01:32:31.883 [INFO][4307] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.925202 containerd[1684]: 2026-01-14 01:32:31.883 [INFO][4307] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.925202 containerd[1684]: 2026-01-14 01:32:31.884 [INFO][4307] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c Jan 14 01:32:31.925202 containerd[1684]: 2026-01-14 01:32:31.887 [INFO][4307] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.925202 containerd[1684]: 2026-01-14 01:32:31.893 [INFO][4307] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.131/26] block=192.168.118.128/26 handle="k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.925202 containerd[1684]: 2026-01-14 01:32:31.894 [INFO][4307] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.131/26] handle="k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:31.925202 containerd[1684]: 2026-01-14 01:32:31.894 [INFO][4307] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:32:31.925202 containerd[1684]: 2026-01-14 01:32:31.894 [INFO][4307] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.131/26] IPv6=[] ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" HandleID="k8s-pod-network.f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" Jan 14 01:32:31.925397 containerd[1684]: 2026-01-14 01:32:31.895 [INFO][4290] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-7rtbs" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0", GenerateName:"calico-apiserver-5f6c79b459-", Namespace:"calico-apiserver", SelfLink:"", UID:"327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f6c79b459", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"", Pod:"calico-apiserver-5f6c79b459-7rtbs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.118.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5e6df3eeac9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:31.925469 containerd[1684]: 2026-01-14 01:32:31.895 [INFO][4290] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.131/32] ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-7rtbs" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" Jan 14 01:32:31.925469 containerd[1684]: 2026-01-14 01:32:31.895 [INFO][4290] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5e6df3eeac9 ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-7rtbs" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" Jan 14 01:32:31.925469 containerd[1684]: 2026-01-14 01:32:31.900 [INFO][4290] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Namespace="calico-apiserver" 
Pod="calico-apiserver-5f6c79b459-7rtbs" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" Jan 14 01:32:31.925537 containerd[1684]: 2026-01-14 01:32:31.900 [INFO][4290] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-7rtbs" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0", GenerateName:"calico-apiserver-5f6c79b459-", Namespace:"calico-apiserver", SelfLink:"", UID:"327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f6c79b459", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c", Pod:"calico-apiserver-5f6c79b459-7rtbs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.118.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali5e6df3eeac9", MAC:"e2:35:5b:22:ea:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:31.925599 containerd[1684]: 2026-01-14 01:32:31.922 [INFO][4290] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" Namespace="calico-apiserver" Pod="calico-apiserver-5f6c79b459-7rtbs" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--apiserver--5f6c79b459--7rtbs-eth0" Jan 14 01:32:31.965386 containerd[1684]: time="2026-01-14T01:32:31.965279293Z" level=info msg="connecting to shim f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c" address="unix:///run/containerd/s/d09e4d41fdc51091246682f8c8868c5a05e9d289ae752e1b0ed6e6224dad8e92" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:31.966000 audit[4327]: NETFILTER_CFG table=filter:119 family=2 entries=22 op=nft_register_rule pid=4327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:31.966000 audit[4327]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffd71842230 a2=0 a3=7ffd7184221c items=0 ppid=3028 pid=4327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:31.966000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:31.973000 audit[4327]: NETFILTER_CFG table=nat:120 family=2 entries=12 op=nft_register_rule pid=4327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:31.973000 audit[4327]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd71842230 a2=0 a3=0 items=0 ppid=3028 pid=4327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:31.973000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:31.993010 systemd[1]: Started cri-containerd-f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c.scope - libcontainer container f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c. Jan 14 01:32:32.000000 audit: BPF prog-id=185 op=LOAD Jan 14 01:32:32.001000 audit: BPF prog-id=186 op=LOAD Jan 14 01:32:32.001000 audit[4342]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4331 pid=4342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632646532313964643662303532666337346635616636343530373039 Jan 14 01:32:32.001000 audit: BPF prog-id=186 op=UNLOAD Jan 14 01:32:32.001000 audit[4342]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4331 pid=4342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632646532313964643662303532666337346635616636343530373039 Jan 14 01:32:32.001000 audit: BPF prog-id=187 op=LOAD Jan 14 01:32:32.001000 audit[4342]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4331 pid=4342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632646532313964643662303532666337346635616636343530373039 Jan 14 01:32:32.001000 audit: BPF prog-id=188 op=LOAD Jan 14 01:32:32.001000 audit[4342]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4331 pid=4342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632646532313964643662303532666337346635616636343530373039 Jan 14 01:32:32.001000 audit: BPF prog-id=188 op=UNLOAD Jan 14 01:32:32.001000 audit[4342]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4331 pid=4342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632646532313964643662303532666337346635616636343530373039 Jan 14 01:32:32.001000 audit: BPF prog-id=187 op=UNLOAD 
Jan 14 01:32:32.001000 audit[4342]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4331 pid=4342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632646532313964643662303532666337346635616636343530373039 Jan 14 01:32:32.001000 audit: BPF prog-id=189 op=LOAD Jan 14 01:32:32.001000 audit[4342]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4331 pid=4342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632646532313964643662303532666337346635616636343530373039 Jan 14 01:32:32.036933 containerd[1684]: time="2026-01-14T01:32:32.036824236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f6c79b459-7rtbs,Uid:327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c\"" Jan 14 01:32:32.038748 containerd[1684]: time="2026-01-14T01:32:32.038193038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:32:32.380599 containerd[1684]: time="2026-01-14T01:32:32.380526031Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:32.384705 containerd[1684]: 
time="2026-01-14T01:32:32.384604623Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:32:32.385869 containerd[1684]: time="2026-01-14T01:32:32.384624801Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:32.386371 kubelet[2924]: E0114 01:32:32.386323 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:32:32.386456 kubelet[2924]: E0114 01:32:32.386376 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:32:32.388680 kubelet[2924]: E0114 01:32:32.388617 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmnkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-7rtbs_calico-apiserver(327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:32.390126 kubelet[2924]: E0114 01:32:32.390064 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:32:32.740307 systemd-networkd[1567]: cali446cfd718fe: Gained IPv6LL Jan 14 01:32:32.769384 containerd[1684]: time="2026-01-14T01:32:32.769317312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kbxbb,Uid:c366ef83-2b27-4825-8b26-f67f058bea78,Namespace:kube-system,Attempt:0,}" Jan 14 01:32:32.904232 systemd-networkd[1567]: cali7557b8384f3: Link UP Jan 14 01:32:32.904420 systemd-networkd[1567]: cali7557b8384f3: Gained carrier Jan 14 01:32:32.921016 kubelet[2924]: E0114 01:32:32.920863 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:32:32.921016 kubelet[2924]: E0114 01:32:32.920917 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:32:32.922823 containerd[1684]: 2026-01-14 01:32:32.808 [INFO][4370] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:32:32.922823 containerd[1684]: 2026-01-14 01:32:32.820 [INFO][4370] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0 coredns-668d6bf9bc- kube-system c366ef83-2b27-4825-8b26-f67f058bea78 797 0 2026-01-14 01:31:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4578-0-0-p-557efd55ff coredns-668d6bf9bc-kbxbb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7557b8384f3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbxbb" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-" Jan 14 01:32:32.922823 containerd[1684]: 2026-01-14 01:32:32.822 [INFO][4370] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbxbb" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" Jan 14 01:32:32.922823 containerd[1684]: 2026-01-14 01:32:32.845 [INFO][4381] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" HandleID="k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Workload="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.846 [INFO][4381] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" HandleID="k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Workload="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f0d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4578-0-0-p-557efd55ff", "pod":"coredns-668d6bf9bc-kbxbb", "timestamp":"2026-01-14 01:32:32.84598557 +0000 UTC"}, Hostname:"ci-4578-0-0-p-557efd55ff", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.846 [INFO][4381] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.846 [INFO][4381] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.846 [INFO][4381] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-557efd55ff' Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.868 [INFO][4381] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.873 [INFO][4381] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.878 [INFO][4381] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.880 [INFO][4381] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923597 containerd[1684]: 2026-01-14 01:32:32.882 [INFO][4381] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923779 containerd[1684]: 2026-01-14 01:32:32.883 [INFO][4381] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923779 containerd[1684]: 2026-01-14 01:32:32.885 [INFO][4381] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201 Jan 14 01:32:32.923779 containerd[1684]: 2026-01-14 01:32:32.888 [INFO][4381] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923779 containerd[1684]: 2026-01-14 01:32:32.895 [INFO][4381] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.132/26] block=192.168.118.128/26 handle="k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923779 containerd[1684]: 2026-01-14 01:32:32.895 [INFO][4381] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.132/26] handle="k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:32.923779 containerd[1684]: 2026-01-14 01:32:32.895 [INFO][4381] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:32:32.923779 containerd[1684]: 2026-01-14 01:32:32.895 [INFO][4381] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.132/26] IPv6=[] ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" HandleID="k8s-pod-network.360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Workload="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" Jan 14 01:32:32.923923 containerd[1684]: 2026-01-14 01:32:32.899 [INFO][4370] cni-plugin/k8s.go 418: Populated endpoint ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbxbb" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c366ef83-2b27-4825-8b26-f67f058bea78", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 31, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"", Pod:"coredns-668d6bf9bc-kbxbb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.118.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7557b8384f3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:32.923923 containerd[1684]: 2026-01-14 01:32:32.899 [INFO][4370] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.132/32] ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbxbb" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" Jan 14 01:32:32.923923 containerd[1684]: 2026-01-14 01:32:32.899 [INFO][4370] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7557b8384f3 ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbxbb" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" Jan 14 01:32:32.923923 containerd[1684]: 2026-01-14 01:32:32.904 [INFO][4370] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbxbb" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" Jan 14 01:32:32.923923 containerd[1684]: 2026-01-14 01:32:32.904 [INFO][4370] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbxbb" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c366ef83-2b27-4825-8b26-f67f058bea78", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 31, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201", Pod:"coredns-668d6bf9bc-kbxbb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.118.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7557b8384f3", 
MAC:"ca:ac:54:97:22:7f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:32.923923 containerd[1684]: 2026-01-14 01:32:32.918 [INFO][4370] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbxbb" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--kbxbb-eth0" Jan 14 01:32:32.968000 audit[4411]: NETFILTER_CFG table=filter:121 family=2 entries=22 op=nft_register_rule pid=4411 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:32.975827 kernel: kauditd_printk_skb: 83 callbacks suppressed Jan 14 01:32:32.977366 kernel: audit: type=1325 audit(1768354352.968:603): table=filter:121 family=2 entries=22 op=nft_register_rule pid=4411 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:32.977390 kernel: audit: type=1300 audit(1768354352.968:603): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc38b06f90 a2=0 a3=7ffc38b06f7c items=0 ppid=3028 pid=4411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.968000 audit[4411]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc38b06f90 a2=0 a3=7ffc38b06f7c items=0 ppid=3028 pid=4411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.968000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:32.980868 kernel: audit: type=1327 audit(1768354352.968:603): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:32.979000 audit[4411]: NETFILTER_CFG table=nat:122 family=2 entries=12 op=nft_register_rule pid=4411 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:32.979000 audit[4411]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc38b06f90 a2=0 a3=0 items=0 ppid=3028 pid=4411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.987672 kernel: audit: type=1325 audit(1768354352.979:604): table=nat:122 family=2 entries=12 op=nft_register_rule pid=4411 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:32.987715 kernel: audit: type=1300 audit(1768354352.979:604): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc38b06f90 a2=0 a3=0 items=0 ppid=3028 pid=4411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:32.992672 containerd[1684]: time="2026-01-14T01:32:32.992444476Z" level=info msg="connecting to shim 360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201" address="unix:///run/containerd/s/26554a54f0f5f74939f0ffcba88a8195e47942447a4449825b132982db048882" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:32.979000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:32.997961 kernel: audit: type=1327 audit(1768354352.979:604): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:33.022209 systemd[1]: Started cri-containerd-360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201.scope - libcontainer container 360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201. Jan 14 01:32:33.034000 audit: BPF prog-id=190 op=LOAD Jan 14 01:32:33.038865 kernel: audit: type=1334 audit(1768354353.034:605): prog-id=190 op=LOAD Jan 14 01:32:33.037000 audit: BPF prog-id=191 op=LOAD Jan 14 01:32:33.040868 kernel: audit: type=1334 audit(1768354353.037:606): prog-id=191 op=LOAD Jan 14 01:32:33.037000 audit[4432]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4421 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.044860 kernel: audit: type=1300 audit(1768354353.037:606): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4421 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.037000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336306238363430626434323130386637623437616261326366616666 Jan 14 01:32:33.051870 kernel: audit: type=1327 audit(1768354353.037:606): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336306238363430626434323130386637623437616261326366616666 Jan 14 01:32:33.037000 audit: BPF prog-id=191 op=UNLOAD Jan 14 01:32:33.037000 audit[4432]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4421 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.037000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336306238363430626434323130386637623437616261326366616666 Jan 14 01:32:33.037000 audit: BPF prog-id=192 op=LOAD Jan 14 01:32:33.037000 audit[4432]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4421 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.037000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336306238363430626434323130386637623437616261326366616666 Jan 14 01:32:33.037000 audit: BPF prog-id=193 op=LOAD Jan 14 01:32:33.037000 audit[4432]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4421 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:32:33.037000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336306238363430626434323130386637623437616261326366616666 Jan 14 01:32:33.037000 audit: BPF prog-id=193 op=UNLOAD Jan 14 01:32:33.037000 audit[4432]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4421 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.037000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336306238363430626434323130386637623437616261326366616666 Jan 14 01:32:33.037000 audit: BPF prog-id=192 op=UNLOAD Jan 14 01:32:33.037000 audit[4432]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4421 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.037000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336306238363430626434323130386637623437616261326366616666 Jan 14 01:32:33.037000 audit: BPF prog-id=194 op=LOAD Jan 14 01:32:33.037000 audit[4432]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4421 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.037000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336306238363430626434323130386637623437616261326366616666 Jan 14 01:32:33.094943 containerd[1684]: time="2026-01-14T01:32:33.094902980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kbxbb,Uid:c366ef83-2b27-4825-8b26-f67f058bea78,Namespace:kube-system,Attempt:0,} returns sandbox id \"360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201\"" Jan 14 01:32:33.096831 containerd[1684]: time="2026-01-14T01:32:33.096808251Z" level=info msg="CreateContainer within sandbox \"360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 01:32:33.121054 containerd[1684]: time="2026-01-14T01:32:33.118585917Z" level=info msg="Container 056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:32:33.128574 containerd[1684]: time="2026-01-14T01:32:33.128537190Z" level=info msg="CreateContainer within sandbox \"360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c\"" Jan 14 01:32:33.129652 containerd[1684]: time="2026-01-14T01:32:33.129630193Z" level=info msg="StartContainer for \"056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c\"" Jan 14 01:32:33.131065 containerd[1684]: time="2026-01-14T01:32:33.131007733Z" level=info msg="connecting to shim 056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c" address="unix:///run/containerd/s/26554a54f0f5f74939f0ffcba88a8195e47942447a4449825b132982db048882" protocol=ttrpc version=3 Jan 14 01:32:33.154031 
systemd[1]: Started cri-containerd-056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c.scope - libcontainer container 056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c. Jan 14 01:32:33.163000 audit: BPF prog-id=195 op=LOAD Jan 14 01:32:33.164000 audit: BPF prog-id=196 op=LOAD Jan 14 01:32:33.164000 audit[4458]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4421 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.164000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035366139336338663635633831656261383337613734666561653536 Jan 14 01:32:33.164000 audit: BPF prog-id=196 op=UNLOAD Jan 14 01:32:33.164000 audit[4458]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4421 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.164000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035366139336338663635633831656261383337613734666561653536 Jan 14 01:32:33.164000 audit: BPF prog-id=197 op=LOAD Jan 14 01:32:33.164000 audit[4458]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4421 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:32:33.164000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035366139336338663635633831656261383337613734666561653536 Jan 14 01:32:33.164000 audit: BPF prog-id=198 op=LOAD Jan 14 01:32:33.164000 audit[4458]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4421 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.164000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035366139336338663635633831656261383337613734666561653536 Jan 14 01:32:33.164000 audit: BPF prog-id=198 op=UNLOAD Jan 14 01:32:33.164000 audit[4458]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4421 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.164000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035366139336338663635633831656261383337613734666561653536 Jan 14 01:32:33.164000 audit: BPF prog-id=197 op=UNLOAD Jan 14 01:32:33.164000 audit[4458]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4421 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.164000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035366139336338663635633831656261383337613734666561653536 Jan 14 01:32:33.164000 audit: BPF prog-id=199 op=LOAD Jan 14 01:32:33.164000 audit[4458]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4421 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.164000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035366139336338663635633831656261383337613734666561653536 Jan 14 01:32:33.183632 containerd[1684]: time="2026-01-14T01:32:33.183602590Z" level=info msg="StartContainer for \"056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c\" returns successfully" Jan 14 01:32:33.508144 systemd-networkd[1567]: cali5e6df3eeac9: Gained IPv6LL Jan 14 01:32:33.769424 containerd[1684]: time="2026-01-14T01:32:33.769175184Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-gknfk,Uid:9e0350f1-074c-4801-8433-6b63afe081c2,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:33.769424 containerd[1684]: time="2026-01-14T01:32:33.769266636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lkwn9,Uid:e4543e16-4927-4a0f-aab5-23e9d79b2579,Namespace:kube-system,Attempt:0,}" Jan 14 01:32:33.769997 containerd[1684]: time="2026-01-14T01:32:33.769820849Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-bfb5m,Uid:e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:33.930460 kubelet[2924]: E0114 01:32:33.930172 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:32:33.962932 kubelet[2924]: I0114 01:32:33.962878 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-kbxbb" podStartSLOduration=40.962808462 podStartE2EDuration="40.962808462s" podCreationTimestamp="2026-01-14 01:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:32:33.957947763 +0000 UTC m=+47.290177013" watchObservedRunningTime="2026-01-14 01:32:33.962808462 +0000 UTC m=+47.295037715" Jan 14 01:32:33.990655 systemd-networkd[1567]: caliee78a12b5d6: Link UP Jan 14 01:32:33.993443 systemd-networkd[1567]: caliee78a12b5d6: Gained carrier Jan 14 01:32:33.996000 audit[4551]: NETFILTER_CFG table=filter:123 family=2 entries=22 op=nft_register_rule pid=4551 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:33.996000 audit[4551]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fffd2a98780 a2=0 a3=7fffd2a9876c items=0 ppid=3028 pid=4551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:33.996000 audit: 
PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:34.001000 audit[4551]: NETFILTER_CFG table=nat:124 family=2 entries=12 op=nft_register_rule pid=4551 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:34.001000 audit[4551]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffd2a98780 a2=0 a3=0 items=0 ppid=3028 pid=4551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.001000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.859 [INFO][4506] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.879 [INFO][4506] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0 csi-node-driver- calico-system e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2 685 0 2026-01-14 01:32:06 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4578-0-0-p-557efd55ff csi-node-driver-bfb5m eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliee78a12b5d6 [] [] }} ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Namespace="calico-system" Pod="csi-node-driver-bfb5m" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-" Jan 14 01:32:34.010758 containerd[1684]: 
2026-01-14 01:32:33.880 [INFO][4506] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Namespace="calico-system" Pod="csi-node-driver-bfb5m" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.926 [INFO][4535] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" HandleID="k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Workload="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.927 [INFO][4535] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" HandleID="k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Workload="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f7f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-557efd55ff", "pod":"csi-node-driver-bfb5m", "timestamp":"2026-01-14 01:32:33.92694687 +0000 UTC"}, Hostname:"ci-4578-0-0-p-557efd55ff", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.927 [INFO][4535] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.927 [INFO][4535] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.927 [INFO][4535] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-557efd55ff' Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.940 [INFO][4535] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.948 [INFO][4535] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.953 [INFO][4535] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.954 [INFO][4535] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.959 [INFO][4535] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.959 [INFO][4535] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.962 [INFO][4535] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.969 [INFO][4535] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.978 [INFO][4535] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.133/26] block=192.168.118.128/26 handle="k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.978 [INFO][4535] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.133/26] handle="k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.978 [INFO][4535] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:32:34.010758 containerd[1684]: 2026-01-14 01:32:33.978 [INFO][4535] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.133/26] IPv6=[] ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" HandleID="k8s-pod-network.ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Workload="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" Jan 14 01:32:34.014742 containerd[1684]: 2026-01-14 01:32:33.983 [INFO][4506] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Namespace="calico-system" Pod="csi-node-driver-bfb5m" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"", Pod:"csi-node-driver-bfb5m", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.118.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliee78a12b5d6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:34.014742 containerd[1684]: 2026-01-14 01:32:33.983 [INFO][4506] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.133/32] ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Namespace="calico-system" Pod="csi-node-driver-bfb5m" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" Jan 14 01:32:34.014742 containerd[1684]: 2026-01-14 01:32:33.983 [INFO][4506] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliee78a12b5d6 ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Namespace="calico-system" Pod="csi-node-driver-bfb5m" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" Jan 14 01:32:34.014742 containerd[1684]: 2026-01-14 01:32:33.995 [INFO][4506] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Namespace="calico-system" Pod="csi-node-driver-bfb5m" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" Jan 14 01:32:34.014742 
containerd[1684]: 2026-01-14 01:32:33.995 [INFO][4506] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Namespace="calico-system" Pod="csi-node-driver-bfb5m" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f", Pod:"csi-node-driver-bfb5m", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.118.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliee78a12b5d6", MAC:"26:00:06:f7:15:fe", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:34.014742 containerd[1684]: 
2026-01-14 01:32:34.007 [INFO][4506] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" Namespace="calico-system" Pod="csi-node-driver-bfb5m" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-csi--node--driver--bfb5m-eth0" Jan 14 01:32:34.025000 audit[4558]: NETFILTER_CFG table=filter:125 family=2 entries=19 op=nft_register_rule pid=4558 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:34.025000 audit[4558]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fffe3f44d60 a2=0 a3=7fffe3f44d4c items=0 ppid=3028 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.025000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:34.030000 audit[4558]: NETFILTER_CFG table=nat:126 family=2 entries=33 op=nft_register_chain pid=4558 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:34.030000 audit[4558]: SYSCALL arch=c000003e syscall=46 success=yes exit=13428 a0=3 a1=7fffe3f44d60 a2=0 a3=7fffe3f44d4c items=0 ppid=3028 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.030000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:34.055868 containerd[1684]: time="2026-01-14T01:32:34.055750335Z" level=info msg="connecting to shim ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f" address="unix:///run/containerd/s/aad9364c302efa1d6e8f4c85235ff521cfe8e364d81003df80d1af24d31865b6" namespace=k8s.io 
protocol=ttrpc version=3 Jan 14 01:32:34.094031 systemd-networkd[1567]: cali915ac16338d: Link UP Jan 14 01:32:34.094223 systemd-networkd[1567]: cali915ac16338d: Gained carrier Jan 14 01:32:34.101121 systemd[1]: Started cri-containerd-ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f.scope - libcontainer container ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f. Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:33.844 [INFO][4494] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:33.871 [INFO][4494] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0 coredns-668d6bf9bc- kube-system e4543e16-4927-4a0f-aab5-23e9d79b2579 794 0 2026-01-14 01:31:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4578-0-0-p-557efd55ff coredns-668d6bf9bc-lkwn9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali915ac16338d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Namespace="kube-system" Pod="coredns-668d6bf9bc-lkwn9" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:33.871 [INFO][4494] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Namespace="kube-system" Pod="coredns-668d6bf9bc-lkwn9" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:33.935 [INFO][4530] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" HandleID="k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Workload="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:33.935 [INFO][4530] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" HandleID="k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Workload="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5d00), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4578-0-0-p-557efd55ff", "pod":"coredns-668d6bf9bc-lkwn9", "timestamp":"2026-01-14 01:32:33.935489054 +0000 UTC"}, Hostname:"ci-4578-0-0-p-557efd55ff", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:33.935 [INFO][4530] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:33.978 [INFO][4530] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:33.978 [INFO][4530] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-557efd55ff' Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.045 [INFO][4530] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.052 [INFO][4530] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.059 [INFO][4530] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.061 [INFO][4530] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.065 [INFO][4530] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.065 [INFO][4530] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.070 [INFO][4530] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101 Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.076 [INFO][4530] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.081 [INFO][4530] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.134/26] block=192.168.118.128/26 handle="k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.081 [INFO][4530] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.134/26] handle="k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.082 [INFO][4530] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:32:34.112335 containerd[1684]: 2026-01-14 01:32:34.082 [INFO][4530] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.134/26] IPv6=[] ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" HandleID="k8s-pod-network.3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Workload="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" Jan 14 01:32:34.112873 containerd[1684]: 2026-01-14 01:32:34.087 [INFO][4494] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Namespace="kube-system" Pod="coredns-668d6bf9bc-lkwn9" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e4543e16-4927-4a0f-aab5-23e9d79b2579", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 31, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"", Pod:"coredns-668d6bf9bc-lkwn9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.118.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali915ac16338d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:34.112873 containerd[1684]: 2026-01-14 01:32:34.087 [INFO][4494] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.134/32] ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Namespace="kube-system" Pod="coredns-668d6bf9bc-lkwn9" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" Jan 14 01:32:34.112873 containerd[1684]: 2026-01-14 01:32:34.087 [INFO][4494] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali915ac16338d ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Namespace="kube-system" Pod="coredns-668d6bf9bc-lkwn9" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" Jan 14 01:32:34.112873 containerd[1684]: 2026-01-14 01:32:34.094 [INFO][4494] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Namespace="kube-system" Pod="coredns-668d6bf9bc-lkwn9" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" Jan 14 01:32:34.112873 containerd[1684]: 2026-01-14 01:32:34.095 [INFO][4494] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Namespace="kube-system" Pod="coredns-668d6bf9bc-lkwn9" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e4543e16-4927-4a0f-aab5-23e9d79b2579", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 31, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101", Pod:"coredns-668d6bf9bc-lkwn9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.118.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali915ac16338d", 
MAC:"fa:bb:ab:0a:1e:b7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:34.112873 containerd[1684]: 2026-01-14 01:32:34.110 [INFO][4494] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" Namespace="kube-system" Pod="coredns-668d6bf9bc-lkwn9" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-coredns--668d6bf9bc--lkwn9-eth0" Jan 14 01:32:34.115000 audit: BPF prog-id=200 op=LOAD Jan 14 01:32:34.116000 audit: BPF prog-id=201 op=LOAD Jan 14 01:32:34.116000 audit[4581]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4569 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165396465333830643731663265643763663664666534386164336637 Jan 14 01:32:34.116000 audit: BPF prog-id=201 op=UNLOAD Jan 14 01:32:34.116000 audit[4581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4569 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165396465333830643731663265643763663664666534386164336637 Jan 14 01:32:34.116000 audit: BPF prog-id=202 op=LOAD Jan 14 01:32:34.116000 audit[4581]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4569 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165396465333830643731663265643763663664666534386164336637 Jan 14 01:32:34.116000 audit: BPF prog-id=203 op=LOAD Jan 14 01:32:34.116000 audit[4581]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4569 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165396465333830643731663265643763663664666534386164336637 Jan 14 01:32:34.116000 audit: BPF prog-id=203 op=UNLOAD Jan 14 01:32:34.116000 audit[4581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4569 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165396465333830643731663265643763663664666534386164336637 Jan 14 01:32:34.116000 audit: BPF prog-id=202 op=UNLOAD Jan 14 01:32:34.116000 audit[4581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4569 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165396465333830643731663265643763663664666534386164336637 Jan 14 01:32:34.116000 audit: BPF prog-id=204 op=LOAD Jan 14 01:32:34.116000 audit[4581]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4569 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165396465333830643731663265643763663664666534386164336637 Jan 14 01:32:34.149246 containerd[1684]: time="2026-01-14T01:32:34.149126948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bfb5m,Uid:e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2,Namespace:calico-system,Attempt:0,} 
returns sandbox id \"ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f\"" Jan 14 01:32:34.156003 containerd[1684]: time="2026-01-14T01:32:34.155971178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:32:34.174259 containerd[1684]: time="2026-01-14T01:32:34.173706987Z" level=info msg="connecting to shim 3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101" address="unix:///run/containerd/s/24c8cc25ba927ac1257bd2318295486f4cf83b01e8cb9b5798e760f2107cba6c" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:34.210670 systemd-networkd[1567]: cali3cf3d0cfeb0: Link UP Jan 14 01:32:34.212003 systemd-networkd[1567]: cali3cf3d0cfeb0: Gained carrier Jan 14 01:32:34.233070 systemd[1]: Started cri-containerd-3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101.scope - libcontainer container 3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101. Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:33.870 [INFO][4493] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:33.888 [INFO][4493] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0 goldmane-666569f655- calico-system 9e0350f1-074c-4801-8433-6b63afe081c2 795 0 2026-01-14 01:32:04 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4578-0-0-p-557efd55ff goldmane-666569f655-gknfk eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali3cf3d0cfeb0 [] [] }} ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Namespace="calico-system" Pod="goldmane-666569f655-gknfk" 
WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:33.888 [INFO][4493] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Namespace="calico-system" Pod="goldmane-666569f655-gknfk" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:33.938 [INFO][4540] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" HandleID="k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Workload="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:33.939 [INFO][4540] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" HandleID="k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Workload="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-557efd55ff", "pod":"goldmane-666569f655-gknfk", "timestamp":"2026-01-14 01:32:33.938876234 +0000 UTC"}, Hostname:"ci-4578-0-0-p-557efd55ff", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:33.939 [INFO][4540] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.082 [INFO][4540] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.082 [INFO][4540] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-557efd55ff' Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.146 [INFO][4540] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.157 [INFO][4540] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.169 [INFO][4540] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.172 [INFO][4540] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.175 [INFO][4540] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.175 [INFO][4540] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.177 [INFO][4540] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.191 [INFO][4540] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.201 [INFO][4540] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.135/26] block=192.168.118.128/26 handle="k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.202 [INFO][4540] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.135/26] handle="k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.202 [INFO][4540] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:32:34.234133 containerd[1684]: 2026-01-14 01:32:34.202 [INFO][4540] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.135/26] IPv6=[] ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" HandleID="k8s-pod-network.16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Workload="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" Jan 14 01:32:34.235265 containerd[1684]: 2026-01-14 01:32:34.205 [INFO][4493] cni-plugin/k8s.go 418: Populated endpoint ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Namespace="calico-system" Pod="goldmane-666569f655-gknfk" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"9e0350f1-074c-4801-8433-6b63afe081c2", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"", Pod:"goldmane-666569f655-gknfk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.118.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3cf3d0cfeb0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:34.235265 containerd[1684]: 2026-01-14 01:32:34.205 [INFO][4493] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.135/32] ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Namespace="calico-system" Pod="goldmane-666569f655-gknfk" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" Jan 14 01:32:34.235265 containerd[1684]: 2026-01-14 01:32:34.205 [INFO][4493] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3cf3d0cfeb0 ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Namespace="calico-system" Pod="goldmane-666569f655-gknfk" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" Jan 14 01:32:34.235265 containerd[1684]: 2026-01-14 01:32:34.213 [INFO][4493] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Namespace="calico-system" Pod="goldmane-666569f655-gknfk" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" Jan 14 01:32:34.235265 containerd[1684]: 2026-01-14 01:32:34.214 [INFO][4493] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Namespace="calico-system" Pod="goldmane-666569f655-gknfk" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"9e0350f1-074c-4801-8433-6b63afe081c2", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b", Pod:"goldmane-666569f655-gknfk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.118.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3cf3d0cfeb0", MAC:"ee:5f:d5:fe:0e:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:34.235265 containerd[1684]: 2026-01-14 01:32:34.229 [INFO][4493] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" Namespace="calico-system" Pod="goldmane-666569f655-gknfk" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-goldmane--666569f655--gknfk-eth0" Jan 14 01:32:34.269564 containerd[1684]: time="2026-01-14T01:32:34.268273626Z" level=info msg="connecting to shim 16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b" address="unix:///run/containerd/s/add80b345a4c2b492ef2b764c0af7d2643fcfd2fc853e140e5eba7c400ea6137" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:34.284000 audit: BPF prog-id=205 op=LOAD Jan 14 01:32:34.284000 audit: BPF prog-id=206 op=LOAD Jan 14 01:32:34.284000 audit[4647]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4630 pid=4647 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361623936613761316562626538663833343735346663363437363730 Jan 14 01:32:34.284000 audit: BPF prog-id=206 op=UNLOAD Jan 14 01:32:34.284000 audit[4647]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4647 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361623936613761316562626538663833343735346663363437363730 Jan 14 01:32:34.286000 audit: BPF 
prog-id=207 op=LOAD Jan 14 01:32:34.286000 audit[4647]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4630 pid=4647 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361623936613761316562626538663833343735346663363437363730 Jan 14 01:32:34.286000 audit: BPF prog-id=208 op=LOAD Jan 14 01:32:34.286000 audit[4647]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4630 pid=4647 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361623936613761316562626538663833343735346663363437363730 Jan 14 01:32:34.286000 audit: BPF prog-id=208 op=UNLOAD Jan 14 01:32:34.286000 audit[4647]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4647 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.286000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361623936613761316562626538663833343735346663363437363730 Jan 14 01:32:34.286000 audit: BPF prog-id=207 op=UNLOAD Jan 14 01:32:34.286000 audit[4647]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4647 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361623936613761316562626538663833343735346663363437363730 Jan 14 01:32:34.286000 audit: BPF prog-id=209 op=LOAD Jan 14 01:32:34.286000 audit[4647]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4630 pid=4647 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361623936613761316562626538663833343735346663363437363730 Jan 14 01:32:34.304065 systemd[1]: Started cri-containerd-16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b.scope - libcontainer container 16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b. 
Jan 14 01:32:34.379474 containerd[1684]: time="2026-01-14T01:32:34.379397516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lkwn9,Uid:e4543e16-4927-4a0f-aab5-23e9d79b2579,Namespace:kube-system,Attempt:0,} returns sandbox id \"3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101\"" Jan 14 01:32:34.380000 audit: BPF prog-id=210 op=LOAD Jan 14 01:32:34.380000 audit: BPF prog-id=211 op=LOAD Jan 14 01:32:34.380000 audit[4692]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4681 pid=4692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136393230623764633665333966343365633630616135303830613535 Jan 14 01:32:34.380000 audit: BPF prog-id=211 op=UNLOAD Jan 14 01:32:34.380000 audit[4692]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4681 pid=4692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136393230623764633665333966343365633630616135303830613535 Jan 14 01:32:34.380000 audit: BPF prog-id=212 op=LOAD Jan 14 01:32:34.380000 audit[4692]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4681 pid=4692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136393230623764633665333966343365633630616135303830613535 Jan 14 01:32:34.381000 audit: BPF prog-id=213 op=LOAD Jan 14 01:32:34.381000 audit[4692]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4681 pid=4692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.381000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136393230623764633665333966343365633630616135303830613535 Jan 14 01:32:34.381000 audit: BPF prog-id=213 op=UNLOAD Jan 14 01:32:34.381000 audit[4692]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4681 pid=4692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.381000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136393230623764633665333966343365633630616135303830613535 Jan 14 01:32:34.381000 audit: BPF prog-id=212 op=UNLOAD Jan 14 01:32:34.381000 audit[4692]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4681 pid=4692 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.381000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136393230623764633665333966343365633630616135303830613535 Jan 14 01:32:34.381000 audit: BPF prog-id=214 op=LOAD Jan 14 01:32:34.381000 audit[4692]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4681 pid=4692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.381000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136393230623764633665333966343365633630616135303830613535 Jan 14 01:32:34.386723 containerd[1684]: time="2026-01-14T01:32:34.386653033Z" level=info msg="CreateContainer within sandbox \"3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 01:32:34.442097 containerd[1684]: time="2026-01-14T01:32:34.442026208Z" level=info msg="Container ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:32:34.444904 containerd[1684]: time="2026-01-14T01:32:34.444872877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-gknfk,Uid:9e0350f1-074c-4801-8433-6b63afe081c2,Namespace:calico-system,Attempt:0,} returns sandbox id \"16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b\"" Jan 14 01:32:34.452015 containerd[1684]: 
time="2026-01-14T01:32:34.451977473Z" level=info msg="CreateContainer within sandbox \"3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c\"" Jan 14 01:32:34.452781 containerd[1684]: time="2026-01-14T01:32:34.452757124Z" level=info msg="StartContainer for \"ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c\"" Jan 14 01:32:34.454976 containerd[1684]: time="2026-01-14T01:32:34.454925016Z" level=info msg="connecting to shim ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c" address="unix:///run/containerd/s/24c8cc25ba927ac1257bd2318295486f4cf83b01e8cb9b5798e760f2107cba6c" protocol=ttrpc version=3 Jan 14 01:32:34.468010 systemd-networkd[1567]: cali7557b8384f3: Gained IPv6LL Jan 14 01:32:34.486130 systemd[1]: Started cri-containerd-ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c.scope - libcontainer container ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c. 
Jan 14 01:32:34.499000 audit: BPF prog-id=215 op=LOAD Jan 14 01:32:34.501000 audit: BPF prog-id=216 op=LOAD Jan 14 01:32:34.501000 audit[4729]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4630 pid=4729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563356161393165316435306165373762626438383339303961633634 Jan 14 01:32:34.501000 audit: BPF prog-id=216 op=UNLOAD Jan 14 01:32:34.501000 audit[4729]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563356161393165316435306165373762626438383339303961633634 Jan 14 01:32:34.501000 audit: BPF prog-id=217 op=LOAD Jan 14 01:32:34.501000 audit[4729]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4630 pid=4729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.501000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563356161393165316435306165373762626438383339303961633634 Jan 14 01:32:34.501000 audit: BPF prog-id=218 op=LOAD Jan 14 01:32:34.501000 audit[4729]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4630 pid=4729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563356161393165316435306165373762626438383339303961633634 Jan 14 01:32:34.501000 audit: BPF prog-id=218 op=UNLOAD Jan 14 01:32:34.501000 audit[4729]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563356161393165316435306165373762626438383339303961633634 Jan 14 01:32:34.501000 audit: BPF prog-id=217 op=UNLOAD Jan 14 01:32:34.501000 audit[4729]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:32:34.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563356161393165316435306165373762626438383339303961633634 Jan 14 01:32:34.501000 audit: BPF prog-id=219 op=LOAD Jan 14 01:32:34.501000 audit[4729]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4630 pid=4729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:34.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563356161393165316435306165373762626438383339303961633634 Jan 14 01:32:34.509831 containerd[1684]: time="2026-01-14T01:32:34.509720939Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:34.511480 containerd[1684]: time="2026-01-14T01:32:34.511451561Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:32:34.511735 containerd[1684]: time="2026-01-14T01:32:34.511544343Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:34.511778 kubelet[2924]: E0114 01:32:34.511687 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 
14 01:32:34.511778 kubelet[2924]: E0114 01:32:34.511722 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:32:34.512318 containerd[1684]: time="2026-01-14T01:32:34.512211184Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:32:34.512443 kubelet[2924]: E0114 01:32:34.512290 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,
RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:34.523860 containerd[1684]: time="2026-01-14T01:32:34.523804586Z" level=info msg="StartContainer for \"ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c\" returns successfully" Jan 14 01:32:34.776266 containerd[1684]: time="2026-01-14T01:32:34.776096086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84b7d46b9c-rw586,Uid:7534bfa4-9af8-4640-8306-673448a61bb0,Namespace:calico-system,Attempt:0,}" Jan 14 01:32:34.828472 containerd[1684]: time="2026-01-14T01:32:34.828425372Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:34.830445 containerd[1684]: time="2026-01-14T01:32:34.830393182Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:32:34.830530 containerd[1684]: time="2026-01-14T01:32:34.830485304Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:34.831398 kubelet[2924]: E0114 01:32:34.831360 2924 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:32:34.831398 kubelet[2924]: E0114 01:32:34.831402 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:32:34.831642 kubelet[2924]: E0114 01:32:34.831603 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:k
ube-api-access-pd8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-gknfk_calico-system(9e0350f1-074c-4801-8433-6b63afe081c2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:34.832010 containerd[1684]: time="2026-01-14T01:32:34.831987489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:32:34.833572 kubelet[2924]: E0114 01:32:34.832919 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: 
\"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:32:34.902635 systemd-networkd[1567]: cali1cc7bd4d3c1: Link UP Jan 14 01:32:34.902782 systemd-networkd[1567]: cali1cc7bd4d3c1: Gained carrier Jan 14 01:32:34.907660 kubelet[2924]: I0114 01:32:34.907631 2924 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.819 [INFO][4761] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.833 [INFO][4761] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0 calico-kube-controllers-84b7d46b9c- calico-system 7534bfa4-9af8-4640-8306-673448a61bb0 798 0 2026-01-14 01:32:06 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:84b7d46b9c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4578-0-0-p-557efd55ff calico-kube-controllers-84b7d46b9c-rw586 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1cc7bd4d3c1 [] [] }} ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Namespace="calico-system" Pod="calico-kube-controllers-84b7d46b9c-rw586" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.833 [INFO][4761] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Namespace="calico-system" Pod="calico-kube-controllers-84b7d46b9c-rw586" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.860 [INFO][4772] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" HandleID="k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.860 [INFO][4772] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" HandleID="k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-557efd55ff", "pod":"calico-kube-controllers-84b7d46b9c-rw586", "timestamp":"2026-01-14 01:32:34.860520952 +0000 UTC"}, Hostname:"ci-4578-0-0-p-557efd55ff", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.860 [INFO][4772] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.860 [INFO][4772] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.861 [INFO][4772] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-557efd55ff' Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.868 [INFO][4772] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.873 [INFO][4772] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.876 [INFO][4772] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.878 [INFO][4772] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.881 [INFO][4772] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.881 [INFO][4772] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.882 [INFO][4772] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.885 [INFO][4772] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.893 [INFO][4772] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.136/26] block=192.168.118.128/26 handle="k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.893 [INFO][4772] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.136/26] handle="k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" host="ci-4578-0-0-p-557efd55ff" Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.893 [INFO][4772] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:32:34.924815 containerd[1684]: 2026-01-14 01:32:34.893 [INFO][4772] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.136/26] IPv6=[] ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" HandleID="k8s-pod-network.ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Workload="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" Jan 14 01:32:34.925397 containerd[1684]: 2026-01-14 01:32:34.896 [INFO][4761] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Namespace="calico-system" Pod="calico-kube-controllers-84b7d46b9c-rw586" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0", GenerateName:"calico-kube-controllers-84b7d46b9c-", Namespace:"calico-system", SelfLink:"", UID:"7534bfa4-9af8-4640-8306-673448a61bb0", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"84b7d46b9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"", Pod:"calico-kube-controllers-84b7d46b9c-rw586", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.118.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1cc7bd4d3c1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:34.925397 containerd[1684]: 2026-01-14 01:32:34.896 [INFO][4761] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.136/32] ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Namespace="calico-system" Pod="calico-kube-controllers-84b7d46b9c-rw586" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" Jan 14 01:32:34.925397 containerd[1684]: 2026-01-14 01:32:34.896 [INFO][4761] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1cc7bd4d3c1 ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Namespace="calico-system" Pod="calico-kube-controllers-84b7d46b9c-rw586" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" Jan 14 01:32:34.925397 containerd[1684]: 2026-01-14 01:32:34.900 [INFO][4761] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Namespace="calico-system" Pod="calico-kube-controllers-84b7d46b9c-rw586" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" Jan 14 01:32:34.925397 containerd[1684]: 2026-01-14 01:32:34.906 [INFO][4761] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Namespace="calico-system" Pod="calico-kube-controllers-84b7d46b9c-rw586" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0", GenerateName:"calico-kube-controllers-84b7d46b9c-", Namespace:"calico-system", SelfLink:"", UID:"7534bfa4-9af8-4640-8306-673448a61bb0", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 32, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"84b7d46b9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-557efd55ff", ContainerID:"ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a", Pod:"calico-kube-controllers-84b7d46b9c-rw586", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.118.136/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1cc7bd4d3c1", MAC:"8e:a6:49:17:65:17", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:32:34.925397 containerd[1684]: 2026-01-14 01:32:34.922 [INFO][4761] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" Namespace="calico-system" Pod="calico-kube-controllers-84b7d46b9c-rw586" WorkloadEndpoint="ci--4578--0--0--p--557efd55ff-k8s-calico--kube--controllers--84b7d46b9c--rw586-eth0" Jan 14 01:32:34.932079 kubelet[2924]: E0114 01:32:34.932049 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:32:34.975506 kubelet[2924]: I0114 01:32:34.975329 2924 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-lkwn9" podStartSLOduration=41.975314363 podStartE2EDuration="41.975314363s" podCreationTimestamp="2026-01-14 01:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:32:34.975093201 +0000 UTC m=+48.307322453" watchObservedRunningTime="2026-01-14 01:32:34.975314363 +0000 UTC m=+48.307543607" Jan 14 01:32:34.978870 containerd[1684]: time="2026-01-14T01:32:34.977142664Z" level=info msg="connecting to shim 
ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a" address="unix:///run/containerd/s/cb71faef0982c4d9da2c22ad51bb05095a6f65f3190719bdacab325588bf003d" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:32:35.001000 audit[4813]: NETFILTER_CFG table=filter:127 family=2 entries=15 op=nft_register_rule pid=4813 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:35.001000 audit[4813]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffee56cd9a0 a2=0 a3=7ffee56cd98c items=0 ppid=3028 pid=4813 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.001000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:35.007000 audit[4813]: NETFILTER_CFG table=nat:128 family=2 entries=49 op=nft_register_chain pid=4813 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:35.007000 audit[4813]: SYSCALL arch=c000003e syscall=46 success=yes exit=17004 a0=3 a1=7ffee56cd9a0 a2=0 a3=7ffee56cd98c items=0 ppid=3028 pid=4813 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.007000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:35.012231 systemd[1]: Started cri-containerd-ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a.scope - libcontainer container ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a. 
Jan 14 01:32:35.021000 audit: BPF prog-id=220 op=LOAD Jan 14 01:32:35.022000 audit: BPF prog-id=221 op=LOAD Jan 14 01:32:35.022000 audit[4806]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4794 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306239353961653763633634623464393463393032373766393931 Jan 14 01:32:35.022000 audit: BPF prog-id=221 op=UNLOAD Jan 14 01:32:35.022000 audit[4806]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4794 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306239353961653763633634623464393463393032373766393931 Jan 14 01:32:35.022000 audit: BPF prog-id=222 op=LOAD Jan 14 01:32:35.022000 audit[4806]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4794 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.022000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306239353961653763633634623464393463393032373766393931 Jan 14 01:32:35.022000 audit: BPF prog-id=223 op=LOAD Jan 14 01:32:35.022000 audit[4806]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4794 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306239353961653763633634623464393463393032373766393931 Jan 14 01:32:35.022000 audit: BPF prog-id=223 op=UNLOAD Jan 14 01:32:35.022000 audit[4806]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4794 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306239353961653763633634623464393463393032373766393931 Jan 14 01:32:35.022000 audit: BPF prog-id=222 op=UNLOAD Jan 14 01:32:35.022000 audit[4806]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4794 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:32:35.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306239353961653763633634623464393463393032373766393931 Jan 14 01:32:35.022000 audit: BPF prog-id=224 op=LOAD Jan 14 01:32:35.022000 audit[4806]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4794 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306239353961653763633634623464393463393032373766393931 Jan 14 01:32:35.057030 containerd[1684]: time="2026-01-14T01:32:35.056939069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84b7d46b9c-rw586,Uid:7534bfa4-9af8-4640-8306-673448a61bb0,Namespace:calico-system,Attempt:0,} returns sandbox id \"ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a\"" Jan 14 01:32:35.172052 containerd[1684]: time="2026-01-14T01:32:35.172012144Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:35.173607 containerd[1684]: time="2026-01-14T01:32:35.173579660Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:32:35.173673 containerd[1684]: time="2026-01-14T01:32:35.173656145Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:35.173886 kubelet[2924]: E0114 01:32:35.173779 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:32:35.173886 kubelet[2924]: E0114 01:32:35.173830 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:32:35.174072 kubelet[2924]: E0114 01:32:35.174037 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:35.174510 containerd[1684]: time="2026-01-14T01:32:35.174398694Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:32:35.175566 kubelet[2924]: E0114 01:32:35.175527 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:35.300080 systemd-networkd[1567]: cali915ac16338d: Gained IPv6LL Jan 14 01:32:35.364017 systemd-networkd[1567]: cali3cf3d0cfeb0: Gained IPv6LL Jan 14 01:32:35.428393 systemd-networkd[1567]: caliee78a12b5d6: Gained IPv6LL Jan 14 01:32:35.506236 containerd[1684]: time="2026-01-14T01:32:35.506199751Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:35.510258 containerd[1684]: time="2026-01-14T01:32:35.510013391Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:32:35.510258 containerd[1684]: time="2026-01-14T01:32:35.510032921Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:35.510623 
kubelet[2924]: E0114 01:32:35.510576 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:32:35.510874 kubelet[2924]: E0114 01:32:35.510612 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:32:35.510874 kubelet[2924]: E0114 01:32:35.510795 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7742,ReadOn
ly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-84b7d46b9c-rw586_calico-system(7534bfa4-9af8-4640-8306-673448a61bb0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:35.512366 kubelet[2924]: E0114 01:32:35.512336 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": 
failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:32:35.897000 audit: BPF prog-id=225 op=LOAD Jan 14 01:32:35.897000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcf8064320 a2=98 a3=1fffffffffffffff items=0 ppid=4871 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.897000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:32:35.897000 audit: BPF prog-id=225 op=UNLOAD Jan 14 01:32:35.897000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffcf80642f0 a3=0 items=0 ppid=4871 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.897000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:32:35.897000 audit: BPF prog-id=226 op=LOAD Jan 14 01:32:35.897000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcf8064200 a2=94 a3=3 items=0 ppid=4871 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.897000 audit: 
PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:32:35.897000 audit: BPF prog-id=226 op=UNLOAD Jan 14 01:32:35.897000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffcf8064200 a2=94 a3=3 items=0 ppid=4871 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.897000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:32:35.898000 audit: BPF prog-id=227 op=LOAD Jan 14 01:32:35.898000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcf8064240 a2=94 a3=7ffcf8064420 items=0 ppid=4871 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.898000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:32:35.898000 audit: BPF prog-id=227 op=UNLOAD Jan 14 01:32:35.898000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffcf8064240 a2=94 a3=7ffcf8064420 items=0 ppid=4871 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.898000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:32:35.899000 audit: BPF prog-id=228 op=LOAD Jan 14 01:32:35.899000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc3398bf00 a2=98 a3=3 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.899000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:35.899000 audit: BPF prog-id=228 op=UNLOAD Jan 14 01:32:35.899000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffc3398bed0 a3=0 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.899000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:35.900000 audit: BPF prog-id=229 op=LOAD Jan 14 01:32:35.900000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc3398bcf0 a2=94 a3=54428f items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.900000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:35.900000 audit: BPF prog-id=229 op=UNLOAD Jan 14 01:32:35.900000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffc3398bcf0 a2=94 a3=54428f 
items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.900000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:35.900000 audit: BPF prog-id=230 op=LOAD Jan 14 01:32:35.900000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc3398bd20 a2=94 a3=2 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.900000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:35.900000 audit: BPF prog-id=230 op=UNLOAD Jan 14 01:32:35.900000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffc3398bd20 a2=0 a3=2 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.900000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:35.941542 kubelet[2924]: E0114 01:32:35.941297 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:32:35.943377 kubelet[2924]: E0114 01:32:35.943295 2924 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:35.943857 kubelet[2924]: E0114 01:32:35.943830 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:32:35.992000 audit[4890]: NETFILTER_CFG table=filter:129 family=2 entries=14 op=nft_register_rule pid=4890 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:35.992000 audit[4890]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff360a7dc0 a2=0 a3=7fff360a7dac items=0 ppid=3028 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:35.992000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:36.016000 audit[4890]: NETFILTER_CFG table=nat:130 family=2 entries=56 op=nft_register_chain pid=4890 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:32:36.016000 audit[4890]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7fff360a7dc0 a2=0 a3=7fff360a7dac items=0 ppid=3028 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.016000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:32:36.090000 audit: BPF prog-id=231 op=LOAD Jan 14 01:32:36.090000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc3398bbe0 a2=94 a3=1 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.090000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.090000 audit: BPF prog-id=231 op=UNLOAD Jan 14 01:32:36.090000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffc3398bbe0 a2=94 a3=1 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.090000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.100000 audit: BPF prog-id=232 op=LOAD Jan 14 01:32:36.100000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffc3398bbd0 a2=94 a3=4 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.100000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.100000 audit: BPF prog-id=232 op=UNLOAD Jan 14 01:32:36.100000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffc3398bbd0 a2=0 a3=4 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.100000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.100000 audit: BPF prog-id=233 op=LOAD Jan 14 01:32:36.100000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc3398ba30 a2=94 a3=5 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.100000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.100000 audit: BPF prog-id=233 op=UNLOAD Jan 14 01:32:36.100000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffc3398ba30 a2=0 a3=5 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.100000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.100000 audit: BPF prog-id=234 op=LOAD Jan 14 01:32:36.100000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffc3398bc50 a2=94 a3=6 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.100000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.101000 audit: BPF prog-id=234 op=UNLOAD Jan 14 01:32:36.101000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffc3398bc50 a2=0 a3=6 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.101000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.101000 audit: BPF prog-id=235 op=LOAD Jan 14 01:32:36.101000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffc3398b400 a2=94 a3=88 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.101000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.101000 audit: BPF prog-id=236 op=LOAD Jan 14 01:32:36.101000 audit[4888]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffc3398b280 a2=94 a3=2 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.101000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.101000 audit: BPF prog-id=236 op=UNLOAD Jan 14 01:32:36.101000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffc3398b2b0 a2=0 a3=7ffc3398b3b0 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.101000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.101000 audit: BPF prog-id=235 op=UNLOAD Jan 14 01:32:36.101000 audit[4888]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=77bed10 a2=0 a3=fa53c46bd57147e9 items=0 ppid=4871 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.101000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:32:36.113000 audit: BPF prog-id=237 op=LOAD Jan 14 01:32:36.113000 audit[4894]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdd7148de0 a2=98 a3=1999999999999999 items=0 ppid=4871 pid=4894 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.113000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:32:36.113000 audit: BPF prog-id=237 op=UNLOAD Jan 14 01:32:36.113000 audit[4894]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffdd7148db0 a3=0 items=0 ppid=4871 pid=4894 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.113000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:32:36.113000 audit: BPF prog-id=238 op=LOAD Jan 14 01:32:36.113000 audit[4894]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdd7148cc0 a2=94 a3=ffff items=0 ppid=4871 pid=4894 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.113000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:32:36.113000 audit: BPF prog-id=238 op=UNLOAD Jan 14 01:32:36.113000 audit[4894]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffdd7148cc0 a2=94 a3=ffff items=0 ppid=4871 pid=4894 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.113000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:32:36.113000 audit: BPF prog-id=239 op=LOAD Jan 14 01:32:36.113000 audit[4894]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdd7148d00 a2=94 a3=7ffdd7148ee0 items=0 ppid=4871 pid=4894 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.113000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:32:36.113000 audit: BPF prog-id=239 op=UNLOAD Jan 14 01:32:36.113000 audit[4894]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffdd7148d00 a2=94 a3=7ffdd7148ee0 items=0 ppid=4871 pid=4894 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.113000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:32:36.175929 systemd-networkd[1567]: vxlan.calico: Link UP Jan 14 01:32:36.175936 systemd-networkd[1567]: vxlan.calico: Gained carrier Jan 14 01:32:36.194000 audit: BPF prog-id=240 op=LOAD Jan 14 01:32:36.194000 audit[4918]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff49845310 a2=98 a3=0 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.194000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.194000 audit: BPF prog-id=240 op=UNLOAD Jan 14 01:32:36.194000 audit[4918]: SYSCALL arch=c000003e syscall=3 
success=yes exit=0 a0=3 a1=8 a2=7fff498452e0 a3=0 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.194000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=241 op=LOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff49845120 a2=94 a3=54428f items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=241 op=UNLOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff49845120 a2=94 a3=54428f items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=242 op=LOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff49845150 a2=94 a3=2 items=0 
ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=242 op=UNLOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff49845150 a2=0 a3=2 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=243 op=LOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff49844f00 a2=94 a3=4 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=243 op=UNLOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff49844f00 a2=94 a3=4 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=244 op=LOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff49845000 a2=94 a3=7fff49845180 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=244 op=UNLOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff49845000 a2=0 a3=7fff49845180 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=245 op=LOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff49844730 a2=94 a3=2 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=245 op=UNLOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff49844730 a2=0 a3=2 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.195000 audit: BPF prog-id=246 op=LOAD Jan 14 01:32:36.195000 audit[4918]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff49844830 a2=94 a3=30 items=0 ppid=4871 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.195000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:32:36.201000 audit: BPF prog-id=247 op=LOAD Jan 14 01:32:36.201000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd4a9a9880 a2=98 a3=0 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.201000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.204000 audit: BPF prog-id=247 op=UNLOAD Jan 14 01:32:36.204000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd4a9a9850 a3=0 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.204000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.204000 audit: BPF prog-id=248 op=LOAD Jan 14 01:32:36.204000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd4a9a9670 a2=94 a3=54428f items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.204000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.204000 audit: BPF prog-id=248 op=UNLOAD Jan 14 01:32:36.204000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd4a9a9670 a2=94 a3=54428f items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.204000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.204000 audit: BPF prog-id=249 op=LOAD Jan 14 01:32:36.204000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd4a9a96a0 a2=94 a3=2 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.204000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.204000 audit: BPF prog-id=249 op=UNLOAD Jan 14 01:32:36.204000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd4a9a96a0 a2=0 a3=2 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.204000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.386000 audit: BPF prog-id=250 op=LOAD Jan 14 01:32:36.386000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd4a9a9560 a2=94 a3=1 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.386000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.386000 audit: BPF prog-id=250 op=UNLOAD Jan 14 01:32:36.386000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd4a9a9560 a2=94 a3=1 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.386000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.396000 audit: BPF prog-id=251 op=LOAD Jan 14 01:32:36.396000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd4a9a9550 a2=94 a3=4 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.396000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.396000 audit: BPF prog-id=251 op=UNLOAD Jan 14 01:32:36.396000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd4a9a9550 a2=0 a3=4 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.396000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.397000 audit: BPF prog-id=252 op=LOAD Jan 14 01:32:36.397000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd4a9a93b0 a2=94 a3=5 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.397000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.397000 audit: BPF prog-id=252 op=UNLOAD Jan 14 01:32:36.397000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd4a9a93b0 a2=0 a3=5 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.397000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.397000 audit: BPF prog-id=253 op=LOAD Jan 14 01:32:36.397000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd4a9a95d0 a2=94 a3=6 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.397000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.397000 audit: BPF prog-id=253 op=UNLOAD Jan 14 01:32:36.397000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd4a9a95d0 a2=0 a3=6 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.397000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.397000 audit: BPF prog-id=254 op=LOAD Jan 14 01:32:36.397000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd4a9a8d80 a2=94 a3=88 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.397000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.397000 audit: BPF prog-id=255 op=LOAD Jan 14 01:32:36.397000 audit[4923]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffd4a9a8c00 a2=94 a3=2 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.397000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.397000 audit: BPF prog-id=255 op=UNLOAD Jan 14 01:32:36.397000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffd4a9a8c30 a2=0 a3=7ffd4a9a8d30 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.397000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.398000 audit: BPF prog-id=254 op=UNLOAD Jan 14 01:32:36.398000 audit[4923]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=37b0cd10 a2=0 a3=ffe5240fceee7178 items=0 ppid=4871 pid=4923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.398000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:32:36.404000 audit: BPF prog-id=246 op=UNLOAD Jan 14 01:32:36.404000 audit[4871]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c000c03640 a2=0 a3=0 items=0 ppid=4015 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.404000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 14 01:32:36.472000 audit[4960]: NETFILTER_CFG table=mangle:131 
family=2 entries=16 op=nft_register_chain pid=4960 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:32:36.472000 audit[4960]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffd5a36d990 a2=0 a3=7ffd5a36d97c items=0 ppid=4871 pid=4960 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.472000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:32:36.474000 audit[4961]: NETFILTER_CFG table=nat:132 family=2 entries=15 op=nft_register_chain pid=4961 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:32:36.474000 audit[4961]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffd5cc75d90 a2=0 a3=7ffd5cc75d7c items=0 ppid=4871 pid=4961 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.474000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:32:36.475000 audit[4956]: NETFILTER_CFG table=raw:133 family=2 entries=21 op=nft_register_chain pid=4956 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:32:36.475000 audit[4956]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffdcdb2c6f0 a2=0 a3=5591e9f4f000 items=0 ppid=4871 pid=4956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.475000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:32:36.485000 audit[4958]: NETFILTER_CFG table=filter:134 family=2 entries=321 op=nft_register_chain pid=4958 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:32:36.485000 audit[4958]: SYSCALL arch=c000003e syscall=46 success=yes exit=190616 a0=3 a1=7fffdc75ae70 a2=0 a3=7fffdc75ae5c items=0 ppid=4871 pid=4958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:32:36.485000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:32:36.836061 systemd-networkd[1567]: cali1cc7bd4d3c1: Gained IPv6LL Jan 14 01:32:36.943172 kubelet[2924]: E0114 01:32:36.943140 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:32:37.860094 systemd-networkd[1567]: vxlan.calico: Gained IPv6LL Jan 14 01:32:44.772317 containerd[1684]: time="2026-01-14T01:32:44.771975996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:32:45.102518 containerd[1684]: time="2026-01-14T01:32:45.102339263Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:45.104726 
containerd[1684]: time="2026-01-14T01:32:45.104651309Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:32:45.104926 containerd[1684]: time="2026-01-14T01:32:45.104701257Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:45.105233 kubelet[2924]: E0114 01:32:45.105134 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:32:45.105233 kubelet[2924]: E0114 01:32:45.105211 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:32:45.106148 kubelet[2924]: E0114 01:32:45.105926 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jdx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-qjrqf_calico-apiserver(a4130f46-1c6e-474c-9a1c-5fd1820934c0): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:45.106699 containerd[1684]: time="2026-01-14T01:32:45.106530782Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:32:45.107504 kubelet[2924]: E0114 01:32:45.107356 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:32:45.453879 containerd[1684]: time="2026-01-14T01:32:45.453625373Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:45.455424 containerd[1684]: time="2026-01-14T01:32:45.455310415Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:32:45.455424 containerd[1684]: time="2026-01-14T01:32:45.455392589Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:45.455701 kubelet[2924]: E0114 01:32:45.455614 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:32:45.455858 kubelet[2924]: E0114 01:32:45.455800 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:32:45.456027 kubelet[2924]: E0114 01:32:45.455994 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:a26f849dc23e45ddb5523da6bae47563,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:45.458820 containerd[1684]: time="2026-01-14T01:32:45.458757821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:32:45.797727 containerd[1684]: time="2026-01-14T01:32:45.797671642Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:45.800404 containerd[1684]: time="2026-01-14T01:32:45.800350507Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:32:45.800490 containerd[1684]: time="2026-01-14T01:32:45.800436680Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:45.800906 kubelet[2924]: E0114 01:32:45.800601 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:32:45.800906 kubelet[2924]: E0114 01:32:45.800646 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:32:45.800906 kubelet[2924]: E0114 01:32:45.800741 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:45.801880 kubelet[2924]: E0114 01:32:45.801810 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:32:48.772550 containerd[1684]: time="2026-01-14T01:32:48.772147193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:32:49.117593 containerd[1684]: time="2026-01-14T01:32:49.117201160Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:49.120710 containerd[1684]: time="2026-01-14T01:32:49.120516701Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:32:49.120710 containerd[1684]: time="2026-01-14T01:32:49.120628224Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:49.121201 kubelet[2924]: E0114 01:32:49.120939 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:32:49.121201 kubelet[2924]: E0114 01:32:49.121021 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:32:49.122314 kubelet[2924]: E0114 01:32:49.121274 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmnkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-7rtbs_calico-apiserver(327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:49.123707 kubelet[2924]: E0114 01:32:49.123535 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:32:49.771691 containerd[1684]: time="2026-01-14T01:32:49.771030103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:32:50.106185 containerd[1684]: time="2026-01-14T01:32:50.106055491Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
01:32:50.107841 containerd[1684]: time="2026-01-14T01:32:50.107803781Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:32:50.107926 containerd[1684]: time="2026-01-14T01:32:50.107902268Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:50.108306 kubelet[2924]: E0114 01:32:50.108057 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:32:50.108306 kubelet[2924]: E0114 01:32:50.108097 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:32:50.108306 kubelet[2924]: E0114 01:32:50.108247 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-gknfk_calico-system(9e0350f1-074c-4801-8433-6b63afe081c2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:50.110040 kubelet[2924]: E0114 01:32:50.110008 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:32:50.773883 containerd[1684]: time="2026-01-14T01:32:50.773272167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:32:51.114798 containerd[1684]: time="2026-01-14T01:32:51.114568507Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:51.118063 containerd[1684]: 
time="2026-01-14T01:32:51.117977872Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:32:51.118245 containerd[1684]: time="2026-01-14T01:32:51.118112010Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:51.118495 kubelet[2924]: E0114 01:32:51.118364 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:32:51.119898 kubelet[2924]: E0114 01:32:51.118504 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:32:51.119898 kubelet[2924]: E0114 01:32:51.118820 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7742,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-84b7d46b9c-rw586_calico-system(7534bfa4-9af8-4640-8306-673448a61bb0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:51.120702 kubelet[2924]: E0114 01:32:51.120587 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:32:51.772252 containerd[1684]: time="2026-01-14T01:32:51.771935129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:32:52.100286 containerd[1684]: time="2026-01-14T01:32:52.100072565Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
01:32:52.102563 containerd[1684]: time="2026-01-14T01:32:52.102473467Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:32:52.102675 containerd[1684]: time="2026-01-14T01:32:52.102596416Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:52.103053 kubelet[2924]: E0114 01:32:52.102987 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:32:52.103135 kubelet[2924]: E0114 01:32:52.103078 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:32:52.104152 kubelet[2924]: E0114 01:32:52.104037 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:32:52.107012 containerd[1684]: time="2026-01-14T01:32:52.106949505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:32:52.427581 containerd[1684]: time="2026-01-14T01:32:52.427404609Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:32:52.430935 containerd[1684]: time="2026-01-14T01:32:52.430753370Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:32:52.430935 containerd[1684]: time="2026-01-14T01:32:52.430827395Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:32:52.431378 kubelet[2924]: E0114 01:32:52.431310 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:32:52.431978 kubelet[2924]: E0114 01:32:52.431929 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:32:52.432129 kubelet[2924]: E0114 01:32:52.432083 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:32:52.433702 kubelet[2924]: E0114 01:32:52.433623 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:32:56.771560 kubelet[2924]: E0114 01:32:56.770230 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:33:00.772476 kubelet[2924]: E0114 01:33:00.772286 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" 
podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:33:00.777014 kubelet[2924]: E0114 01:33:00.776882 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:33:02.771865 kubelet[2924]: E0114 01:33:02.771297 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:33:05.769288 kubelet[2924]: E0114 01:33:05.769188 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:33:05.772403 kubelet[2924]: E0114 01:33:05.772338 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:33:07.770418 containerd[1684]: time="2026-01-14T01:33:07.770342505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:33:08.117862 containerd[1684]: time="2026-01-14T01:33:08.116831864Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:08.119010 containerd[1684]: time="2026-01-14T01:33:08.118919219Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:33:08.119140 containerd[1684]: time="2026-01-14T01:33:08.119106735Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:08.119760 
kubelet[2924]: E0114 01:33:08.119336 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:33:08.119760 kubelet[2924]: E0114 01:33:08.119384 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:33:08.119760 kubelet[2924]: E0114 01:33:08.119492 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jdx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-qjrqf_calico-apiserver(a4130f46-1c6e-474c-9a1c-5fd1820934c0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:08.121009 kubelet[2924]: E0114 01:33:08.120978 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:33:13.770083 containerd[1684]: time="2026-01-14T01:33:13.769892110Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:33:14.109154 containerd[1684]: time="2026-01-14T01:33:14.109021142Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
01:33:14.111836 containerd[1684]: time="2026-01-14T01:33:14.111789297Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:33:14.111836 containerd[1684]: time="2026-01-14T01:33:14.111867734Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:14.112248 kubelet[2924]: E0114 01:33:14.112141 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:33:14.112248 kubelet[2924]: E0114 01:33:14.112191 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:33:14.112675 kubelet[2924]: E0114 01:33:14.112632 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmnkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-7rtbs_calico-apiserver(327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:14.113821 kubelet[2924]: E0114 01:33:14.113792 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:33:15.771081 containerd[1684]: time="2026-01-14T01:33:15.769688270Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:33:16.112953 containerd[1684]: time="2026-01-14T01:33:16.112372312Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:16.114294 containerd[1684]: time="2026-01-14T01:33:16.114036003Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:33:16.114294 containerd[1684]: time="2026-01-14T01:33:16.114072651Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:16.114390 kubelet[2924]: E0114 01:33:16.114310 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:33:16.114390 kubelet[2924]: E0114 01:33:16.114350 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:33:16.114641 kubelet[2924]: E0114 01:33:16.114514 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:a26f849dc23e45ddb5523da6bae47563,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:16.115269 containerd[1684]: time="2026-01-14T01:33:16.115248030Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:33:16.636915 containerd[1684]: time="2026-01-14T01:33:16.636867376Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:16.638579 containerd[1684]: time="2026-01-14T01:33:16.638542738Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:33:16.638652 containerd[1684]: time="2026-01-14T01:33:16.638615939Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:16.638812 kubelet[2924]: E0114 01:33:16.638756 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:33:16.638881 kubelet[2924]: E0114 01:33:16.638815 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:33:16.639105 kubelet[2924]: E0114 01:33:16.639055 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7742,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-84b7d46b9c-rw586_calico-system(7534bfa4-9af8-4640-8306-673448a61bb0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:16.639427 containerd[1684]: time="2026-01-14T01:33:16.639406867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:33:16.640762 kubelet[2924]: E0114 01:33:16.640728 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:33:16.978906 containerd[1684]: time="2026-01-14T01:33:16.978531343Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io 
Jan 14 01:33:16.980282 containerd[1684]: time="2026-01-14T01:33:16.980247286Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:33:16.980374 containerd[1684]: time="2026-01-14T01:33:16.980316035Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:16.980566 kubelet[2924]: E0114 01:33:16.980522 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:33:16.980629 kubelet[2924]: E0114 01:33:16.980565 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:33:16.980831 kubelet[2924]: E0114 01:33:16.980790 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:16.981978 containerd[1684]: time="2026-01-14T01:33:16.981914169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:33:16.982158 kubelet[2924]: E0114 01:33:16.982130 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:33:17.302787 containerd[1684]: time="2026-01-14T01:33:17.302730500Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:17.304805 containerd[1684]: time="2026-01-14T01:33:17.304710484Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:33:17.304805 containerd[1684]: time="2026-01-14T01:33:17.304780312Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:17.304965 kubelet[2924]: E0114 01:33:17.304921 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:33:17.306166 kubelet[2924]: E0114 01:33:17.304972 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:33:17.306166 kubelet[2924]: E0114 01:33:17.305108 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPa
thExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-gknfk_calico-system(9e0350f1-074c-4801-8433-6b63afe081c2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:17.306354 kubelet[2924]: E0114 01:33:17.306319 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 
14 01:33:17.770299 containerd[1684]: time="2026-01-14T01:33:17.769924467Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:33:18.084526 containerd[1684]: time="2026-01-14T01:33:18.084359410Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:18.087335 containerd[1684]: time="2026-01-14T01:33:18.087233407Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:33:18.087335 containerd[1684]: time="2026-01-14T01:33:18.087311494Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:18.087482 kubelet[2924]: E0114 01:33:18.087448 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:33:18.087530 kubelet[2924]: E0114 01:33:18.087504 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:33:18.088048 kubelet[2924]: E0114 01:33:18.087929 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:33:18.090407 containerd[1684]: time="2026-01-14T01:33:18.090389795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:33:18.439231 containerd[1684]: time="2026-01-14T01:33:18.438608459Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:18.440586 containerd[1684]: time="2026-01-14T01:33:18.440466050Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:33:18.440586 containerd[1684]: time="2026-01-14T01:33:18.440496721Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:18.440809 kubelet[2924]: E0114 01:33:18.440729 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:33:18.441197 kubelet[2924]: E0114 01:33:18.440814 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:33:18.441197 kubelet[2924]: E0114 01:33:18.440956 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:18.442512 kubelet[2924]: E0114 01:33:18.442455 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:33:18.771089 kubelet[2924]: E0114 01:33:18.770609 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:33:28.770319 kubelet[2924]: E0114 01:33:28.770048 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" 
podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:33:28.770737 kubelet[2924]: E0114 01:33:28.770529 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:33:29.775159 kubelet[2924]: E0114 01:33:29.775097 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:33:29.776451 kubelet[2924]: E0114 01:33:29.775387 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not 
found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:33:30.771841 kubelet[2924]: E0114 01:33:30.771797 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:33:30.773000 kubelet[2924]: E0114 01:33:30.772976 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:33:39.771106 kubelet[2924]: E0114 01:33:39.771060 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: 
code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:33:40.772568 kubelet[2924]: E0114 01:33:40.771905 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:33:40.772568 kubelet[2924]: E0114 01:33:40.772510 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:33:42.770764 kubelet[2924]: E0114 01:33:42.770730 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:33:43.769828 kubelet[2924]: E0114 01:33:43.769691 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:33:43.770615 kubelet[2924]: E0114 01:33:43.770486 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:33:51.769333 kubelet[2924]: 
E0114 01:33:51.769040 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:33:51.769333 kubelet[2924]: E0114 01:33:51.769117 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:33:53.770871 kubelet[2924]: E0114 01:33:53.769714 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:33:53.774127 kubelet[2924]: E0114 01:33:53.774087 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:33:57.769378 containerd[1684]: time="2026-01-14T01:33:57.769090612Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:33:58.101817 containerd[1684]: time="2026-01-14T01:33:58.101385018Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:58.103399 containerd[1684]: time="2026-01-14T01:33:58.103339336Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:33:58.103519 containerd[1684]: time="2026-01-14T01:33:58.103490031Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:58.103834 kubelet[2924]: E0114 01:33:58.103778 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:33:58.105872 kubelet[2924]: E0114 01:33:58.104490 2924 kuberuntime_image.go:55] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:33:58.106029 containerd[1684]: time="2026-01-14T01:33:58.105096689Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:33:58.106237 kubelet[2924]: E0114 01:33:58.106186 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:a26f849dc23e45ddb5523da6bae47563,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:58.455879 containerd[1684]: time="2026-01-14T01:33:58.455306965Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:58.458228 containerd[1684]: time="2026-01-14T01:33:58.458166432Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:33:58.459227 containerd[1684]: time="2026-01-14T01:33:58.458291728Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:58.459227 containerd[1684]: time="2026-01-14T01:33:58.459105485Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:33:58.459320 kubelet[2924]: E0114 01:33:58.458568 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:33:58.459320 kubelet[2924]: E0114 01:33:58.458616 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:33:58.460023 kubelet[2924]: E0114 01:33:58.459620 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jdx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-qjrqf_calico-apiserver(a4130f46-1c6e-474c-9a1c-5fd1820934c0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:58.461867 kubelet[2924]: E0114 01:33:58.461320 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:33:58.791504 containerd[1684]: time="2026-01-14T01:33:58.791456482Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:33:58.795645 containerd[1684]: time="2026-01-14T01:33:58.795551839Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" 
error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:33:58.795645 containerd[1684]: time="2026-01-14T01:33:58.795607326Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:33:58.797036 kubelet[2924]: E0114 01:33:58.796985 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:33:58.797099 kubelet[2924]: E0114 01:33:58.797043 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:33:58.797235 kubelet[2924]: E0114 01:33:58.797190 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:33:58.798852 kubelet[2924]: E0114 01:33:58.798789 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:34:03.769538 containerd[1684]: time="2026-01-14T01:34:03.769416513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:34:04.120434 containerd[1684]: time="2026-01-14T01:34:04.120038215Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:34:04.122420 containerd[1684]: time="2026-01-14T01:34:04.122321278Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:34:04.122420 containerd[1684]: time="2026-01-14T01:34:04.122397335Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:34:04.122939 kubelet[2924]: E0114 01:34:04.122664 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:34:04.122939 kubelet[2924]: E0114 01:34:04.122900 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:34:04.123956 kubelet[2924]: E0114 01:34:04.123891 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPa
thExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-gknfk_calico-system(9e0350f1-074c-4801-8433-6b63afe081c2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:34:04.125100 kubelet[2924]: E0114 01:34:04.125074 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 
14 01:34:05.769548 containerd[1684]: time="2026-01-14T01:34:05.769496624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:34:06.090385 containerd[1684]: time="2026-01-14T01:34:06.090131269Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:34:06.092129 containerd[1684]: time="2026-01-14T01:34:06.092099695Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:34:06.092229 containerd[1684]: time="2026-01-14T01:34:06.092181970Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:34:06.092593 kubelet[2924]: E0114 01:34:06.092543 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:34:06.092894 kubelet[2924]: E0114 01:34:06.092610 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:34:06.093552 kubelet[2924]: E0114 01:34:06.093502 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmnkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-7rtbs_calico-apiserver(327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:34:06.094703 kubelet[2924]: E0114 01:34:06.094679 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:34:06.770488 containerd[1684]: time="2026-01-14T01:34:06.770341357Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:34:07.095140 containerd[1684]: time="2026-01-14T01:34:07.095022290Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:34:07.097240 containerd[1684]: time="2026-01-14T01:34:07.097196092Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:34:07.097329 containerd[1684]: time="2026-01-14T01:34:07.097296236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:34:07.097453 kubelet[2924]: E0114 01:34:07.097418 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:34:07.097696 kubelet[2924]: E0114 01:34:07.097469 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:34:07.097696 kubelet[2924]: E0114 01:34:07.097591 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:34:07.101091 containerd[1684]: time="2026-01-14T01:34:07.101062416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:34:07.449210 containerd[1684]: time="2026-01-14T01:34:07.448797997Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:34:07.450852 containerd[1684]: time="2026-01-14T01:34:07.450816990Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:34:07.451010 containerd[1684]: time="2026-01-14T01:34:07.450903474Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:34:07.451077 kubelet[2924]: E0114 01:34:07.451045 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:34:07.451122 kubelet[2924]: E0114 01:34:07.451088 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:34:07.451227 kubelet[2924]: E0114 01:34:07.451197 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:34:07.452515 kubelet[2924]: E0114 01:34:07.452486 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:34:07.770073 containerd[1684]: time="2026-01-14T01:34:07.770006000Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:34:08.111062 containerd[1684]: time="2026-01-14T01:34:08.110956057Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:34:08.112862 containerd[1684]: time="2026-01-14T01:34:08.112819862Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:34:08.112930 containerd[1684]: time="2026-01-14T01:34:08.112903295Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:34:08.114045 kubelet[2924]: E0114 01:34:08.113051 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:34:08.114045 kubelet[2924]: E0114 01:34:08.113091 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:34:08.114045 kubelet[2924]: E0114 01:34:08.113200 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.
pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7742,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-84b7d46b9c-rw586_calico-system(7534bfa4-9af8-4640-8306-673448a61bb0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:34:08.114523 kubelet[2924]: E0114 01:34:08.114484 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: 
\"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:34:11.770215 kubelet[2924]: E0114 01:34:11.770170 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:34:12.771949 kubelet[2924]: E0114 01:34:12.771409 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:34:15.770114 kubelet[2924]: E0114 01:34:15.770030 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:34:16.445040 update_engine[1653]: I20260114 01:34:16.444979 1653 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 14 01:34:16.445040 update_engine[1653]: I20260114 01:34:16.445032 1653 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 14 01:34:16.445975 update_engine[1653]: I20260114 01:34:16.445228 1653 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 14 01:34:16.446516 update_engine[1653]: I20260114 01:34:16.446492 1653 omaha_request_params.cc:62] Current group set to developer Jan 14 01:34:16.449051 update_engine[1653]: I20260114 01:34:16.447559 1653 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 14 01:34:16.449051 update_engine[1653]: I20260114 01:34:16.447579 1653 update_attempter.cc:643] Scheduling an action processor start. 
Jan 14 01:34:16.449051 update_engine[1653]: I20260114 01:34:16.447597 1653 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 14 01:34:16.449381 locksmithd[1700]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 14 01:34:16.454674 update_engine[1653]: I20260114 01:34:16.454641 1653 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 14 01:34:16.454748 update_engine[1653]: I20260114 01:34:16.454733 1653 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 14 01:34:16.454748 update_engine[1653]: I20260114 01:34:16.454741 1653 omaha_request_action.cc:272] Request: Jan 14 01:34:16.454748 update_engine[1653]: Jan 14 01:34:16.454748 update_engine[1653]: Jan 14 01:34:16.454748 update_engine[1653]: Jan 14 01:34:16.454748 update_engine[1653]: Jan 14 01:34:16.454748 update_engine[1653]: Jan 14 01:34:16.454748 update_engine[1653]: Jan 14 01:34:16.454748 update_engine[1653]: Jan 14 01:34:16.454748 update_engine[1653]: Jan 14 01:34:16.454945 update_engine[1653]: I20260114 01:34:16.454747 1653 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:34:16.458734 update_engine[1653]: I20260114 01:34:16.458700 1653 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:34:16.459210 update_engine[1653]: I20260114 01:34:16.459187 1653 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 14 01:34:16.468859 update_engine[1653]: E20260114 01:34:16.468039 1653 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:34:16.468859 update_engine[1653]: I20260114 01:34:16.468122 1653 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 14 01:34:19.771412 kubelet[2924]: E0114 01:34:19.771377 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:34:20.770884 kubelet[2924]: E0114 01:34:20.770433 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:34:20.772604 kubelet[2924]: E0114 01:34:20.771391 2924 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:34:23.769778 kubelet[2924]: E0114 01:34:23.769710 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:34:24.772177 kubelet[2924]: E0114 01:34:24.772135 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" 
podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:34:26.351399 update_engine[1653]: I20260114 01:34:26.351338 1653 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:34:26.351802 update_engine[1653]: I20260114 01:34:26.351423 1653 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:34:26.351802 update_engine[1653]: I20260114 01:34:26.351715 1653 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 01:34:26.358333 update_engine[1653]: E20260114 01:34:26.358283 1653 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:34:26.358444 update_engine[1653]: I20260114 01:34:26.358373 1653 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 14 01:34:29.769458 kubelet[2924]: E0114 01:34:29.769379 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:34:32.770882 kubelet[2924]: E0114 01:34:32.770459 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:34:34.770341 kubelet[2924]: E0114 01:34:34.770204 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:34:35.769518 kubelet[2924]: E0114 01:34:35.769480 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:34:36.354989 update_engine[1653]: I20260114 01:34:36.354804 1653 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:34:36.354989 update_engine[1653]: I20260114 01:34:36.354899 1653 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:34:36.355350 update_engine[1653]: I20260114 01:34:36.355191 1653 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 14 01:34:36.360954 update_engine[1653]: E20260114 01:34:36.360908 1653 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:34:36.361154 update_engine[1653]: I20260114 01:34:36.360993 1653 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 14 01:34:36.770814 kubelet[2924]: E0114 01:34:36.770462 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:34:37.770679 kubelet[2924]: E0114 01:34:37.770639 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:34:40.773420 kubelet[2924]: E0114 01:34:40.773382 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:34:45.771430 kubelet[2924]: E0114 01:34:45.771395 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:34:45.771936 kubelet[2924]: E0114 01:34:45.771651 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 
01:34:46.346559 update_engine[1653]: I20260114 01:34:46.346340 1653 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:34:46.346559 update_engine[1653]: I20260114 01:34:46.346472 1653 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:34:46.347425 update_engine[1653]: I20260114 01:34:46.347380 1653 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 01:34:46.353783 update_engine[1653]: E20260114 01:34:46.353099 1653 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353191 1653 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353199 1653 omaha_request_action.cc:617] Omaha request response: Jan 14 01:34:46.353783 update_engine[1653]: E20260114 01:34:46.353289 1653 omaha_request_action.cc:636] Omaha request network transfer failed. Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353315 1653 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353320 1653 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353325 1653 update_attempter.cc:306] Processing Done. Jan 14 01:34:46.353783 update_engine[1653]: E20260114 01:34:46.353340 1653 update_attempter.cc:619] Update failed. 
Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353346 1653 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353353 1653 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353358 1653 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353435 1653 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353461 1653 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 14 01:34:46.353783 update_engine[1653]: I20260114 01:34:46.353466 1653 omaha_request_action.cc:272] Request: Jan 14 01:34:46.353783 update_engine[1653]: Jan 14 01:34:46.353783 update_engine[1653]: Jan 14 01:34:46.354217 update_engine[1653]: Jan 14 01:34:46.354217 update_engine[1653]: Jan 14 01:34:46.354217 update_engine[1653]: Jan 14 01:34:46.354217 update_engine[1653]: Jan 14 01:34:46.354217 update_engine[1653]: I20260114 01:34:46.353473 1653 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:34:46.354217 update_engine[1653]: I20260114 01:34:46.353493 1653 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:34:46.354217 update_engine[1653]: I20260114 01:34:46.353750 1653 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 14 01:34:46.354343 locksmithd[1700]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 14 01:34:46.361315 update_engine[1653]: E20260114 01:34:46.361131 1653 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:34:46.361315 update_engine[1653]: I20260114 01:34:46.361229 1653 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 14 01:34:46.361315 update_engine[1653]: I20260114 01:34:46.361238 1653 omaha_request_action.cc:617] Omaha request response: Jan 14 01:34:46.361315 update_engine[1653]: I20260114 01:34:46.361246 1653 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:34:46.361315 update_engine[1653]: I20260114 01:34:46.361251 1653 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:34:46.361315 update_engine[1653]: I20260114 01:34:46.361256 1653 update_attempter.cc:306] Processing Done. Jan 14 01:34:46.361315 update_engine[1653]: I20260114 01:34:46.361263 1653 update_attempter.cc:310] Error event sent. 
Jan 14 01:34:46.361315 update_engine[1653]: I20260114 01:34:46.361274 1653 update_check_scheduler.cc:74] Next update check in 42m34s Jan 14 01:34:46.362204 locksmithd[1700]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 14 01:34:48.769873 kubelet[2924]: E0114 01:34:48.769822 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:34:51.768993 kubelet[2924]: E0114 01:34:51.768953 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:34:51.770087 kubelet[2924]: E0114 01:34:51.770056 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:34:51.771504 kubelet[2924]: E0114 01:34:51.770116 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:34:58.769386 kubelet[2924]: E0114 01:34:58.769010 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:34:59.769570 kubelet[2924]: E0114 01:34:59.769527 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:35:01.769352 kubelet[2924]: E0114 01:35:01.769309 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:35:03.769347 kubelet[2924]: E0114 01:35:03.769293 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:35:05.770281 kubelet[2924]: E0114 01:35:05.770239 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:35:06.770578 kubelet[2924]: E0114 01:35:06.770534 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:35:11.769840 kubelet[2924]: E0114 01:35:11.769804 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:35:13.770942 kubelet[2924]: E0114 01:35:13.770885 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:35:16.771797 kubelet[2924]: E0114 01:35:16.771742 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:35:16.771797 kubelet[2924]: E0114 01:35:16.771808 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:35:17.769707 
kubelet[2924]: E0114 01:35:17.769406 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:35:17.770926 kubelet[2924]: E0114 01:35:17.770901 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:35:23.769170 kubelet[2924]: E0114 01:35:23.768795 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:35:27.769525 containerd[1684]: time="2026-01-14T01:35:27.769491375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:35:28.094182 containerd[1684]: time="2026-01-14T01:35:28.094059813Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:35:28.096973 containerd[1684]: time="2026-01-14T01:35:28.096925750Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:35:28.097052 containerd[1684]: time="2026-01-14T01:35:28.097006446Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:35:28.097245 kubelet[2924]: E0114 01:35:28.097194 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:35:28.097550 kubelet[2924]: E0114 01:35:28.097250 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:35:28.097550 kubelet[2924]: E0114 01:35:28.097384 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:35:28.099322 containerd[1684]: time="2026-01-14T01:35:28.099261300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:35:28.441521 containerd[1684]: time="2026-01-14T01:35:28.441412744Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:35:28.443000 containerd[1684]: time="2026-01-14T01:35:28.442961619Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:35:28.443076 containerd[1684]: time="2026-01-14T01:35:28.443039014Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:35:28.443206 kubelet[2924]: E0114 01:35:28.443167 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:35:28.443252 kubelet[2924]: E0114 01:35:28.443207 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:35:28.443330 kubelet[2924]: E0114 01:35:28.443302 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:35:28.444609 kubelet[2924]: E0114 01:35:28.444580 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:35:28.770965 containerd[1684]: time="2026-01-14T01:35:28.770552084Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:35:29.103536 containerd[1684]: time="2026-01-14T01:35:29.103164929Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:35:29.105011 containerd[1684]: time="2026-01-14T01:35:29.104926230Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:35:29.105011 containerd[1684]: time="2026-01-14T01:35:29.104972505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:35:29.105291 kubelet[2924]: E0114 01:35:29.105241 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:35:29.106035 kubelet[2924]: E0114 01:35:29.105460 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:35:29.106035 kubelet[2924]: E0114 01:35:29.105586 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmnkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-7rtbs_calico-apiserver(327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:35:29.106787 kubelet[2924]: E0114 01:35:29.106760 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:35:30.772087 containerd[1684]: time="2026-01-14T01:35:30.771237106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:35:31.096925 containerd[1684]: time="2026-01-14T01:35:31.096501554Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
01:35:31.098266 containerd[1684]: time="2026-01-14T01:35:31.098171834Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:35:31.098266 containerd[1684]: time="2026-01-14T01:35:31.098220925Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:35:31.098429 kubelet[2924]: E0114 01:35:31.098376 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:35:31.098734 kubelet[2924]: E0114 01:35:31.098426 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:35:31.098734 kubelet[2924]: E0114 01:35:31.098558 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jdx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-qjrqf_calico-apiserver(a4130f46-1c6e-474c-9a1c-5fd1820934c0): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:35:31.100070 kubelet[2924]: E0114 01:35:31.100039 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:35:32.772574 containerd[1684]: time="2026-01-14T01:35:32.772287481Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:35:33.110969 containerd[1684]: time="2026-01-14T01:35:33.110649783Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:35:33.113199 containerd[1684]: time="2026-01-14T01:35:33.113158041Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:35:33.113939 containerd[1684]: time="2026-01-14T01:35:33.113239418Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:35:33.115042 kubelet[2924]: E0114 01:35:33.114989 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:35:33.115346 kubelet[2924]: E0114 01:35:33.115051 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:35:33.115346 kubelet[2924]: E0114 01:35:33.115279 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:a26f849dc23e45ddb5523da6bae47563,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:35:33.115767 containerd[1684]: time="2026-01-14T01:35:33.115739302Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:35:33.458664 containerd[1684]: time="2026-01-14T01:35:33.458252973Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:35:33.460338 containerd[1684]: time="2026-01-14T01:35:33.460300263Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:35:33.460426 containerd[1684]: time="2026-01-14T01:35:33.460373780Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:35:33.460559 kubelet[2924]: E0114 01:35:33.460519 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:35:33.460616 kubelet[2924]: E0114 01:35:33.460560 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:35:33.461790 containerd[1684]: time="2026-01-14T01:35:33.461727649Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:35:33.462034 kubelet[2924]: E0114 01:35:33.461979 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-gknfk_calico-system(9e0350f1-074c-4801-8433-6b63afe081c2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:35:33.463319 kubelet[2924]: E0114 01:35:33.463295 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:35:33.789414 containerd[1684]: time="2026-01-14T01:35:33.789365215Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:35:33.794759 containerd[1684]: time="2026-01-14T01:35:33.794710509Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:35:33.794880 containerd[1684]: time="2026-01-14T01:35:33.794740038Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:35:33.795727 kubelet[2924]: E0114 01:35:33.795691 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:35:33.795798 kubelet[2924]: E0114 01:35:33.795737 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:35:33.795882 kubelet[2924]: E0114 01:35:33.795834 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:35:33.797165 kubelet[2924]: E0114 01:35:33.797139 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:35:34.771696 containerd[1684]: time="2026-01-14T01:35:34.771482037Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:35:35.099377 containerd[1684]: time="2026-01-14T01:35:35.099242303Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:35:35.102876 containerd[1684]: time="2026-01-14T01:35:35.101783516Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:35:35.103026 containerd[1684]: time="2026-01-14T01:35:35.101855424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:35:35.103172 kubelet[2924]: E0114 01:35:35.103134 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:35:35.103612 kubelet[2924]: E0114 01:35:35.103182 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:35:35.103612 kubelet[2924]: E0114 01:35:35.103293 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7742,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHan
dler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-84b7d46b9c-rw586_calico-system(7534bfa4-9af8-4640-8306-673448a61bb0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:35:35.104967 kubelet[2924]: E0114 01:35:35.104939 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" 
podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:35:40.771668 kubelet[2924]: E0114 01:35:40.771104 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:35:41.770955 kubelet[2924]: E0114 01:35:41.770878 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:35:44.771811 kubelet[2924]: E0114 01:35:44.771771 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:35:45.770236 kubelet[2924]: E0114 01:35:45.770201 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:35:47.770238 kubelet[2924]: E0114 01:35:47.770196 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:35:48.770612 kubelet[2924]: E0114 01:35:48.769957 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:35:54.770862 kubelet[2924]: E0114 01:35:54.770625 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:35:54.774243 kubelet[2924]: E0114 01:35:54.773110 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:35:55.770806 kubelet[2924]: E0114 01:35:55.770714 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:36:00.770913 kubelet[2924]: E0114 01:36:00.769873 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:36:01.770215 kubelet[2924]: E0114 01:36:01.769975 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:36:03.769724 kubelet[2924]: E0114 01:36:03.769690 2924 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:36:06.774515 kubelet[2924]: E0114 01:36:06.774458 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:36:07.770835 kubelet[2924]: E0114 01:36:07.770389 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:36:07.773258 kubelet[2924]: E0114 01:36:07.773194 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:36:12.774193 kubelet[2924]: E0114 01:36:12.774153 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:36:15.769219 kubelet[2924]: E0114 01:36:15.769180 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:36:16.769614 kubelet[2924]: E0114 01:36:16.769071 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:36:18.769859 kubelet[2924]: E0114 01:36:18.769668 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:36:19.769214 kubelet[2924]: E0114 01:36:19.769166 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:36:22.770293 kubelet[2924]: E0114 01:36:22.770100 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:36:25.771002 kubelet[2924]: E0114 01:36:25.770624 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:36:26.769599 kubelet[2924]: E0114 01:36:26.769484 2924 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:36:30.773248 kubelet[2924]: E0114 01:36:30.772505 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:36:32.770489 kubelet[2924]: E0114 01:36:32.769877 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:36:32.772424 kubelet[2924]: E0114 01:36:32.770935 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:36:36.770565 kubelet[2924]: E0114 01:36:36.770423 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:36:39.769869 kubelet[2924]: E0114 01:36:39.769359 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:36:40.769691 kubelet[2924]: E0114 01:36:40.769135 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:36:42.619130 containerd[1684]: time="2026-01-14T01:36:42.618983143Z" level=info msg="container event discarded" container=339def40388bcd2d97bec53b8aa77531f8334cd6f6584d19028b88d75b0aa511 type=CONTAINER_CREATED_EVENT Jan 14 01:36:42.619130 containerd[1684]: time="2026-01-14T01:36:42.619044609Z" level=info msg="container event discarded" container=339def40388bcd2d97bec53b8aa77531f8334cd6f6584d19028b88d75b0aa511 type=CONTAINER_STARTED_EVENT Jan 14 01:36:42.646631 containerd[1684]: time="2026-01-14T01:36:42.646566909Z" level=info msg="container event discarded" container=0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9 type=CONTAINER_CREATED_EVENT Jan 14 01:36:42.646631 containerd[1684]: time="2026-01-14T01:36:42.646622672Z" level=info msg="container event discarded" container=0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9 type=CONTAINER_STARTED_EVENT Jan 14 01:36:42.659732 containerd[1684]: time="2026-01-14T01:36:42.659509839Z" level=info msg="container event discarded" container=c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d type=CONTAINER_CREATED_EVENT Jan 14 01:36:42.670745 containerd[1684]: time="2026-01-14T01:36:42.670704964Z" level=info msg="container event discarded" 
container=f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a type=CONTAINER_CREATED_EVENT Jan 14 01:36:42.670939 containerd[1684]: time="2026-01-14T01:36:42.670884166Z" level=info msg="container event discarded" container=f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a type=CONTAINER_STARTED_EVENT Jan 14 01:36:42.670939 containerd[1684]: time="2026-01-14T01:36:42.670895604Z" level=info msg="container event discarded" container=b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956 type=CONTAINER_CREATED_EVENT Jan 14 01:36:42.704505 containerd[1684]: time="2026-01-14T01:36:42.704453005Z" level=info msg="container event discarded" container=958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893 type=CONTAINER_CREATED_EVENT Jan 14 01:36:42.765922 containerd[1684]: time="2026-01-14T01:36:42.765861345Z" level=info msg="container event discarded" container=c856333ddff7d5efec92cddb11039dbb6fb16280428a94f9e8aab79fcf8d1f0d type=CONTAINER_STARTED_EVENT Jan 14 01:36:42.818394 containerd[1684]: time="2026-01-14T01:36:42.818325884Z" level=info msg="container event discarded" container=958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893 type=CONTAINER_STARTED_EVENT Jan 14 01:36:42.837592 containerd[1684]: time="2026-01-14T01:36:42.837540265Z" level=info msg="container event discarded" container=b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956 type=CONTAINER_STARTED_EVENT Jan 14 01:36:45.769969 kubelet[2924]: E0114 01:36:45.769918 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:36:45.771135 kubelet[2924]: E0114 01:36:45.771026 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:36:46.762755 systemd[1805]: Created slice background.slice - User Background Tasks Slice. Jan 14 01:36:46.766083 systemd[1805]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 14 01:36:46.817946 systemd[1805]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. 
Jan 14 01:36:47.770736 kubelet[2924]: E0114 01:36:47.770692 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:36:47.771797 kubelet[2924]: E0114 01:36:47.771759 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:36:53.223402 containerd[1684]: time="2026-01-14T01:36:53.223299779Z" level=info msg="container event discarded" container=1d63f9a299576723b3516f0bce422fec640509a4fc56c5254388ff09c55082fe type=CONTAINER_CREATED_EVENT Jan 14 01:36:53.223973 containerd[1684]: time="2026-01-14T01:36:53.223925669Z" level=info msg="container event discarded" container=1d63f9a299576723b3516f0bce422fec640509a4fc56c5254388ff09c55082fe type=CONTAINER_STARTED_EVENT Jan 
14 01:36:53.251283 containerd[1684]: time="2026-01-14T01:36:53.251190276Z" level=info msg="container event discarded" container=c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5 type=CONTAINER_CREATED_EVENT Jan 14 01:36:53.339528 containerd[1684]: time="2026-01-14T01:36:53.339476288Z" level=info msg="container event discarded" container=c6bc930816f35328a443579edad5c02d14dcbee7cc4dd60fc030b7d8fcf03ad5 type=CONTAINER_STARTED_EVENT Jan 14 01:36:53.528444 containerd[1684]: time="2026-01-14T01:36:53.528012046Z" level=info msg="container event discarded" container=2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9 type=CONTAINER_CREATED_EVENT Jan 14 01:36:53.528634 containerd[1684]: time="2026-01-14T01:36:53.528519705Z" level=info msg="container event discarded" container=2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9 type=CONTAINER_STARTED_EVENT Jan 14 01:36:53.769896 kubelet[2924]: E0114 01:36:53.769861 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:36:53.770246 kubelet[2924]: E0114 01:36:53.770133 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:36:55.942081 containerd[1684]: time="2026-01-14T01:36:55.941990218Z" level=info msg="container event discarded" container=3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a type=CONTAINER_CREATED_EVENT Jan 14 01:36:55.994687 containerd[1684]: time="2026-01-14T01:36:55.994629381Z" level=info msg="container event discarded" container=3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a type=CONTAINER_STARTED_EVENT Jan 14 01:36:57.769511 kubelet[2924]: E0114 01:36:57.769458 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:36:58.770035 kubelet[2924]: E0114 01:36:58.769906 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:36:59.769714 kubelet[2924]: E0114 01:36:59.769569 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:37:01.770371 kubelet[2924]: E0114 01:37:01.770305 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:37:04.770765 kubelet[2924]: E0114 01:37:04.770056 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:37:04.772634 kubelet[2924]: E0114 01:37:04.772080 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:37:06.357022 containerd[1684]: time="2026-01-14T01:37:06.356887940Z" level=info msg="container event discarded" container=2b881cdb8f7de8a825950033b3cd54d9f4ee50f6a4ba3299f9b170473d97e836 type=CONTAINER_CREATED_EVENT Jan 14 01:37:06.357022 containerd[1684]: time="2026-01-14T01:37:06.357047253Z" level=info msg="container event discarded" container=2b881cdb8f7de8a825950033b3cd54d9f4ee50f6a4ba3299f9b170473d97e836 type=CONTAINER_STARTED_EVENT Jan 14 01:37:06.528453 containerd[1684]: time="2026-01-14T01:37:06.528381963Z" level=info msg="container event discarded" container=c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563 type=CONTAINER_CREATED_EVENT Jan 14 01:37:06.528453 containerd[1684]: time="2026-01-14T01:37:06.528434834Z" level=info msg="container event discarded" container=c8e8830ff37a3c2550983126a109b1927210e3215e2ce4f8006c0bd02ebd6563 type=CONTAINER_STARTED_EVENT Jan 14 01:37:08.771451 kubelet[2924]: E0114 01:37:08.770934 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:37:08.902737 containerd[1684]: time="2026-01-14T01:37:08.902629298Z" level=info msg="container event discarded" container=19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf type=CONTAINER_CREATED_EVENT Jan 14 01:37:08.972066 containerd[1684]: time="2026-01-14T01:37:08.972008463Z" level=info msg="container event discarded" container=19a959898132eeeb90eebb99dc125382e7451d0d95813bb4aa89d857aecde4cf type=CONTAINER_STARTED_EVENT Jan 14 01:37:09.768919 kubelet[2924]: E0114 01:37:09.768879 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:37:10.422179 containerd[1684]: time="2026-01-14T01:37:10.422114976Z" level=info msg="container event discarded" container=527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e type=CONTAINER_CREATED_EVENT Jan 14 01:37:10.517703 containerd[1684]: time="2026-01-14T01:37:10.517358893Z" level=info msg="container event discarded" container=527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e type=CONTAINER_STARTED_EVENT Jan 14 01:37:12.152681 containerd[1684]: time="2026-01-14T01:37:12.152563530Z" level=info msg="container event discarded" 
container=527c88e1e69b9e8b23e98ce22d88f69667f217c0ce0ee9b1a4564c049cee532e type=CONTAINER_STOPPED_EVENT Jan 14 01:37:12.771546 kubelet[2924]: E0114 01:37:12.771490 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:37:12.772083 kubelet[2924]: E0114 01:37:12.771559 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" 
Jan 14 01:37:15.769444 kubelet[2924]: E0114 01:37:15.769403 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:37:16.658855 containerd[1684]: time="2026-01-14T01:37:16.658764387Z" level=info msg="container event discarded" container=4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936 type=CONTAINER_CREATED_EVENT Jan 14 01:37:16.769933 containerd[1684]: time="2026-01-14T01:37:16.769886855Z" level=info msg="container event discarded" container=4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936 type=CONTAINER_STARTED_EVENT Jan 14 01:37:16.770587 kubelet[2924]: E0114 01:37:16.770346 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:37:19.604385 containerd[1684]: time="2026-01-14T01:37:19.604319820Z" level=info msg="container event discarded" container=4f6272efb5c5f549e0a57ec3c6ea3768280e6d04a437389be29e4791419ed936 type=CONTAINER_STOPPED_EVENT Jan 14 01:37:21.769328 kubelet[2924]: E0114 01:37:21.769282 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:37:23.769658 kubelet[2924]: E0114 01:37:23.769616 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:37:23.770667 kubelet[2924]: E0114 01:37:23.770629 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 
01:37:27.761736 containerd[1684]: time="2026-01-14T01:37:27.761675513Z" level=info msg="container event discarded" container=2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63 type=CONTAINER_CREATED_EVENT Jan 14 01:37:27.768988 kubelet[2924]: E0114 01:37:27.768899 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:37:27.772650 kubelet[2924]: E0114 01:37:27.772619 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:37:27.941377 containerd[1684]: time="2026-01-14T01:37:27.941284559Z" level=info msg="container event discarded" container=2549bd9b09cc1e1cf1a918b0a6d0f803d4b29f5c59b60e06bd28b783d0164f63 type=CONTAINER_STARTED_EVENT Jan 14 
01:37:29.769595 kubelet[2924]: E0114 01:37:29.769084 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:37:30.093292 containerd[1684]: time="2026-01-14T01:37:30.093155304Z" level=info msg="container event discarded" container=6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5 type=CONTAINER_CREATED_EVENT Jan 14 01:37:30.093292 containerd[1684]: time="2026-01-14T01:37:30.093212394Z" level=info msg="container event discarded" container=6ea3fd0294eb1ef145952b18df125a07a00845586070630015d1edf9f86315d5 type=CONTAINER_STARTED_EVENT Jan 14 01:37:31.041967 containerd[1684]: time="2026-01-14T01:37:31.041883141Z" level=info msg="container event discarded" container=ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d type=CONTAINER_CREATED_EVENT Jan 14 01:37:31.041967 containerd[1684]: time="2026-01-14T01:37:31.041942127Z" level=info msg="container event discarded" container=ec565caa14faf3143ebea5b85d82d4c357e0b3ecb882f06a7e2b71060dae815d type=CONTAINER_STARTED_EVENT Jan 14 01:37:32.047333 containerd[1684]: time="2026-01-14T01:37:32.047270303Z" level=info msg="container event discarded" container=f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c type=CONTAINER_CREATED_EVENT Jan 14 01:37:32.047333 containerd[1684]: time="2026-01-14T01:37:32.047313855Z" level=info msg="container event discarded" container=f2de219dd6b052fc74f5af64507090ceafc468fe93b70bbd742d2052b3c37e9c type=CONTAINER_STARTED_EVENT Jan 14 01:37:33.105870 containerd[1684]: time="2026-01-14T01:37:33.105696491Z" 
level=info msg="container event discarded" container=360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201 type=CONTAINER_CREATED_EVENT Jan 14 01:37:33.105870 containerd[1684]: time="2026-01-14T01:37:33.105787921Z" level=info msg="container event discarded" container=360b8640bd42108f7b47aba2cfaffe083fdf6f1f478e5eca5e47a53218c2f201 type=CONTAINER_STARTED_EVENT Jan 14 01:37:33.138225 containerd[1684]: time="2026-01-14T01:37:33.138127548Z" level=info msg="container event discarded" container=056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c type=CONTAINER_CREATED_EVENT Jan 14 01:37:33.192732 containerd[1684]: time="2026-01-14T01:37:33.192661407Z" level=info msg="container event discarded" container=056a93c8f65c81eba837a74feae56b4dae58ef09c72bc2892391fe34aa39768c type=CONTAINER_STARTED_EVENT Jan 14 01:37:34.159788 containerd[1684]: time="2026-01-14T01:37:34.159735201Z" level=info msg="container event discarded" container=ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f type=CONTAINER_CREATED_EVENT Jan 14 01:37:34.160551 containerd[1684]: time="2026-01-14T01:37:34.160205024Z" level=info msg="container event discarded" container=ae9de380d71f2ed7cf6dfe48ad3f7eae1a058876fadddca6b5f18d2c7e20a62f type=CONTAINER_STARTED_EVENT Jan 14 01:37:34.389599 containerd[1684]: time="2026-01-14T01:37:34.389519557Z" level=info msg="container event discarded" container=3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101 type=CONTAINER_CREATED_EVENT Jan 14 01:37:34.389599 containerd[1684]: time="2026-01-14T01:37:34.389563021Z" level=info msg="container event discarded" container=3ab96a7a1ebbe8f834754fc647670efa99c73a85fe80ee8f34864f69bb7f0101 type=CONTAINER_STARTED_EVENT Jan 14 01:37:34.456271 containerd[1684]: time="2026-01-14T01:37:34.455743138Z" level=info msg="container event discarded" container=16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b type=CONTAINER_CREATED_EVENT Jan 14 01:37:34.456271 containerd[1684]: 
time="2026-01-14T01:37:34.455821998Z" level=info msg="container event discarded" container=16920b7dc6e39f43ec60aa5080a55c873879c2518b9981a20169f3e49def380b type=CONTAINER_STARTED_EVENT Jan 14 01:37:34.456271 containerd[1684]: time="2026-01-14T01:37:34.455831721Z" level=info msg="container event discarded" container=ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c type=CONTAINER_CREATED_EVENT Jan 14 01:37:34.533906 containerd[1684]: time="2026-01-14T01:37:34.533836202Z" level=info msg="container event discarded" container=ec5aa91e1d50ae77bbd883909ac647a127f71f0b05e5a6ed670a783fcdb9090c type=CONTAINER_STARTED_EVENT Jan 14 01:37:35.067663 containerd[1684]: time="2026-01-14T01:37:35.067588865Z" level=info msg="container event discarded" container=ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a type=CONTAINER_CREATED_EVENT Jan 14 01:37:35.067885 containerd[1684]: time="2026-01-14T01:37:35.067860335Z" level=info msg="container event discarded" container=ae0b959ae7cc64b4d94c90277f991f4dcef59aee81b320265c738b40eaad124a type=CONTAINER_STARTED_EVENT Jan 14 01:37:35.769594 kubelet[2924]: E0114 01:37:35.769547 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:37:36.772061 kubelet[2924]: E0114 01:37:36.772021 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:37:37.771933 kubelet[2924]: E0114 01:37:37.771878 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:37:41.769512 kubelet[2924]: E0114 01:37:41.769457 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:37:42.772548 kubelet[2924]: E0114 01:37:42.772470 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:37:43.768880 kubelet[2924]: E0114 01:37:43.768694 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:37:46.776209 kubelet[2924]: E0114 01:37:46.775959 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:37:49.770405 kubelet[2924]: E0114 01:37:49.770365 2924 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:37:51.770666 kubelet[2924]: E0114 01:37:51.770397 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:37:52.771481 kubelet[2924]: E0114 01:37:52.771431 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:37:55.771415 kubelet[2924]: E0114 01:37:55.771370 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:37:58.772146 kubelet[2924]: E0114 01:37:58.771499 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:37:59.770418 kubelet[2924]: E0114 01:37:59.770346 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:38:03.770653 kubelet[2924]: E0114 01:38:03.770250 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:38:03.771415 kubelet[2924]: E0114 01:38:03.771367 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:38:05.770274 kubelet[2924]: E0114 01:38:05.770228 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:38:06.771732 kubelet[2924]: E0114 01:38:06.771648 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:38:11.769164 kubelet[2924]: E0114 01:38:11.768896 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:38:13.769872 containerd[1684]: time="2026-01-14T01:38:13.769779030Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:38:14.108433 containerd[1684]: time="2026-01-14T01:38:14.107343459Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:38:14.110220 containerd[1684]: time="2026-01-14T01:38:14.110089227Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:38:14.110596 containerd[1684]: time="2026-01-14T01:38:14.110091005Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:38:14.110796 kubelet[2924]: E0114 01:38:14.110748 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:38:14.110796 kubelet[2924]: E0114 01:38:14.110789 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:38:14.112177 kubelet[2924]: E0114 01:38:14.112130 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmnkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-7rtbs_calico-apiserver(327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:38:14.113398 kubelet[2924]: E0114 01:38:14.113330 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:38:14.770245 kubelet[2924]: E0114 01:38:14.770210 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:38:15.769620 containerd[1684]: time="2026-01-14T01:38:15.769572142Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:38:16.112691 containerd[1684]: time="2026-01-14T01:38:16.112484619Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:38:16.115300 containerd[1684]: time="2026-01-14T01:38:16.115202697Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:38:16.115461 containerd[1684]: 
time="2026-01-14T01:38:16.115251024Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:38:16.115707 kubelet[2924]: E0114 01:38:16.115662 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:38:16.116109 kubelet[2924]: E0114 01:38:16.115728 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:38:16.116109 kubelet[2924]: E0114 01:38:16.115908 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:a26f849dc23e45ddb5523da6bae47563,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10
001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:38:16.120058 containerd[1684]: time="2026-01-14T01:38:16.120018685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:38:16.445925 containerd[1684]: time="2026-01-14T01:38:16.445525398Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:38:16.447786 containerd[1684]: time="2026-01-14T01:38:16.447739676Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:38:16.447862 containerd[1684]: time="2026-01-14T01:38:16.447823090Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:38:16.448858 kubelet[2924]: E0114 01:38:16.448134 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:38:16.448858 kubelet[2924]: E0114 01:38:16.448181 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:38:16.448858 kubelet[2924]: E0114 01:38:16.448306 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqs2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalat
ion:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fcdf56664-kjljc_calico-system(5694b728-96e4-405e-ad55-bbb10255a07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:38:16.450141 kubelet[2924]: E0114 01:38:16.450109 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:38:16.772045 containerd[1684]: time="2026-01-14T01:38:16.771994686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:38:17.113919 containerd[1684]: time="2026-01-14T01:38:17.113703331Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:38:17.115522 containerd[1684]: time="2026-01-14T01:38:17.115433928Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:38:17.115522 containerd[1684]: time="2026-01-14T01:38:17.115507004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:38:17.115758 kubelet[2924]: E0114 01:38:17.115713 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:38:17.116050 kubelet[2924]: E0114 01:38:17.115768 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:38:17.116050 kubelet[2924]: E0114 01:38:17.115889 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jdx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5f6c79b459-qjrqf_calico-apiserver(a4130f46-1c6e-474c-9a1c-5fd1820934c0): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:38:17.117351 kubelet[2924]: E0114 01:38:17.117304 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:38:21.770827 containerd[1684]: time="2026-01-14T01:38:21.770578894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:38:22.110109 containerd[1684]: time="2026-01-14T01:38:22.109972383Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:38:22.111988 containerd[1684]: time="2026-01-14T01:38:22.111952407Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:38:22.112882 containerd[1684]: time="2026-01-14T01:38:22.112029496Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:38:22.113297 kubelet[2924]: E0114 01:38:22.112988 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:38:22.113297 kubelet[2924]: E0114 01:38:22.113045 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:38:22.113297 kubelet[2924]: E0114 01:38:22.113160 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:38:22.115830 containerd[1684]: time="2026-01-14T01:38:22.115795231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:38:22.452215 containerd[1684]: time="2026-01-14T01:38:22.452031416Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:38:22.455140 containerd[1684]: time="2026-01-14T01:38:22.454838341Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:38:22.455549 containerd[1684]: time="2026-01-14T01:38:22.454951640Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:38:22.456278 kubelet[2924]: E0114 01:38:22.456022 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:38:22.456390 kubelet[2924]: E0114 01:38:22.456312 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:38:22.458033 kubelet[2924]: E0114 01:38:22.457687 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-545mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-bfb5m_calico-system(e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:38:22.459820 kubelet[2924]: E0114 01:38:22.459736 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:38:25.772110 containerd[1684]: time="2026-01-14T01:38:25.772053940Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:38:26.095978 containerd[1684]: time="2026-01-14T01:38:26.095752304Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:38:26.097784 containerd[1684]: time="2026-01-14T01:38:26.097700640Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:38:26.097875 containerd[1684]: time="2026-01-14T01:38:26.097769408Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:38:26.097987 kubelet[2924]: E0114 01:38:26.097950 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:38:26.098903 kubelet[2924]: E0114 01:38:26.098001 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:38:26.098903 kubelet[2924]: E0114 01:38:26.098154 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-gknfk_calico-system(9e0350f1-074c-4801-8433-6b63afe081c2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:38:26.100001 kubelet[2924]: E0114 01:38:26.099967 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:38:27.768685 kubelet[2924]: E0114 01:38:27.768649 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:38:29.770869 kubelet[2924]: E0114 01:38:29.769898 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:38:29.772227 containerd[1684]: time="2026-01-14T01:38:29.771746644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:38:29.772952 kubelet[2924]: E0114 01:38:29.772869 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:38:30.093329 containerd[1684]: time="2026-01-14T01:38:30.092780015Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:38:30.095244 containerd[1684]: time="2026-01-14T01:38:30.095153068Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:38:30.095379 containerd[1684]: time="2026-01-14T01:38:30.095336570Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:38:30.095550 kubelet[2924]: E0114 01:38:30.095504 2924 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:38:30.095605 kubelet[2924]: E0114 01:38:30.095562 2924 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:38:30.095728 kubelet[2924]: E0114 01:38:30.095674 2924 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7742,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-84b7d46b9c-rw586_calico-system(7534bfa4-9af8-4640-8306-673448a61bb0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:38:30.097055 kubelet[2924]: E0114 01:38:30.097031 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:38:33.772744 kubelet[2924]: E0114 01:38:33.772668 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc 
= failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:38:39.769512 kubelet[2924]: E0114 01:38:39.769406 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:38:40.290824 systemd[1]: Started sshd@9-10.0.22.183:22-68.220.241.50:45090.service - OpenSSH per-connection server daemon (68.220.241.50:45090). Jan 14 01:38:40.295102 kernel: kauditd_printk_skb: 372 callbacks suppressed Jan 14 01:38:40.295196 kernel: audit: type=1130 audit(1768354720.290:735): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.22.183:22-68.220.241.50:45090 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:38:40.290000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.22.183:22-68.220.241.50:45090 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:38:40.770705 kubelet[2924]: E0114 01:38:40.770671 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:38:40.842000 audit[5446]: USER_ACCT pid=5446 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:40.846999 kernel: audit: type=1101 audit(1768354720.842:736): pid=5446 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:40.848941 sshd[5446]: Accepted publickey for core from 68.220.241.50 port 45090 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:38:40.849266 sshd-session[5446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:38:40.847000 audit[5446]: CRED_ACQ pid=5446 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:40.852873 kernel: audit: type=1103 audit(1768354720.847:737): pid=5446 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:40.858350 systemd-logind[1652]: New session 11 of user core. Jan 14 01:38:40.858989 kernel: audit: type=1006 audit(1768354720.847:738): pid=5446 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 14 01:38:40.847000 audit[5446]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff20cd9b0 a2=3 a3=0 items=0 ppid=1 pid=5446 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:38:40.863902 kernel: audit: type=1300 audit(1768354720.847:738): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff20cd9b0 a2=3 a3=0 items=0 ppid=1 pid=5446 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:38:40.847000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:38:40.865023 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 14 01:38:40.867148 kernel: audit: type=1327 audit(1768354720.847:738): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:38:40.869000 audit[5446]: USER_START pid=5446 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:40.871000 audit[5450]: CRED_ACQ pid=5450 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:40.875576 kernel: audit: type=1105 audit(1768354720.869:739): pid=5446 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:40.875830 kernel: audit: type=1103 audit(1768354720.871:740): pid=5450 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:41.218569 sshd[5450]: Connection closed by 68.220.241.50 port 45090 Jan 14 01:38:41.219713 sshd-session[5446]: pam_unix(sshd:session): session closed for user core Jan 14 01:38:41.220000 audit[5446]: USER_END pid=5446 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:41.220000 audit[5446]: CRED_DISP pid=5446 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:41.226419 systemd[1]: sshd@9-10.0.22.183:22-68.220.241.50:45090.service: Deactivated successfully. Jan 14 01:38:41.226896 kernel: audit: type=1106 audit(1768354721.220:741): pid=5446 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:41.226947 kernel: audit: type=1104 audit(1768354721.220:742): pid=5446 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:41.229238 systemd[1]: session-11.scope: Deactivated successfully. Jan 14 01:38:41.226000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.22.183:22-68.220.241.50:45090 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:38:41.231982 systemd-logind[1652]: Session 11 logged out. Waiting for processes to exit. Jan 14 01:38:41.233193 systemd-logind[1652]: Removed session 11. 
Jan 14 01:38:42.770944 kubelet[2924]: E0114 01:38:42.770498 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:38:43.769640 kubelet[2924]: E0114 01:38:43.769605 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:38:43.770966 kubelet[2924]: E0114 01:38:43.770928 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:38:46.332000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.22.183:22-68.220.241.50:33162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:38:46.334115 systemd[1]: Started sshd@10-10.0.22.183:22-68.220.241.50:33162.service - OpenSSH per-connection server daemon (68.220.241.50:33162). Jan 14 01:38:46.335300 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:38:46.335429 kernel: audit: type=1130 audit(1768354726.332:744): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.22.183:22-68.220.241.50:33162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:38:46.882000 audit[5465]: USER_ACCT pid=5465 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:46.886791 sshd[5465]: Accepted publickey for core from 68.220.241.50 port 33162 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:38:46.888863 kernel: audit: type=1101 audit(1768354726.882:745): pid=5465 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:46.889080 sshd-session[5465]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:38:46.885000 audit[5465]: CRED_ACQ pid=5465 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:46.895949 kernel: audit: type=1103 audit(1768354726.885:746): pid=5465 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:38:46.900232 systemd-logind[1652]: New session 12 of user core. Jan 14 01:38:46.885000 audit[5465]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffd5271770 a2=3 a3=0 items=0 ppid=1 pid=5465 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:38:46.902315 kernel: audit: type=1006 audit(1768354726.885:747): pid=5465 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 14 01:38:46.902366 kernel: audit: type=1300 audit(1768354726.885:747): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffd5271770 a2=3 a3=0 items=0 ppid=1 pid=5465 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:38:46.885000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:38:46.907085 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 14 01:38:46.909062 kernel: audit: type=1327 audit(1768354726.885:747): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:38:46.909000 audit[5465]: USER_START pid=5465 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:46.916861 kernel: audit: type=1105 audit(1768354726.909:748): pid=5465 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:46.914000 audit[5471]: CRED_ACQ pid=5471 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:46.922859 kernel: audit: type=1103 audit(1768354726.914:749): pid=5471 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:47.243867 sshd[5471]: Connection closed by 68.220.241.50 port 33162
Jan 14 01:38:47.244317 sshd-session[5465]: pam_unix(sshd:session): session closed for user core
Jan 14 01:38:47.244000 audit[5465]: USER_END pid=5465 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:47.250054 systemd[1]: sshd@10-10.0.22.183:22-68.220.241.50:33162.service: Deactivated successfully.
Jan 14 01:38:47.251898 kernel: audit: type=1106 audit(1768354727.244:750): pid=5465 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:47.252599 systemd[1]: session-12.scope: Deactivated successfully.
Jan 14 01:38:47.244000 audit[5465]: CRED_DISP pid=5465 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:47.255933 systemd-logind[1652]: Session 12 logged out. Waiting for processes to exit.
Jan 14 01:38:47.257228 systemd-logind[1652]: Removed session 12.
Jan 14 01:38:47.258313 kernel: audit: type=1104 audit(1768354727.244:751): pid=5465 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:47.247000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.22.183:22-68.220.241.50:33162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:38:47.770387 kubelet[2924]: E0114 01:38:47.770335 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2"
Jan 14 01:38:52.352078 systemd[1]: Started sshd@11-10.0.22.183:22-68.220.241.50:33174.service - OpenSSH per-connection server daemon (68.220.241.50:33174).
Jan 14 01:38:52.356674 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 14 01:38:52.356750 kernel: audit: type=1130 audit(1768354732.350:753): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.22.183:22-68.220.241.50:33174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:38:52.350000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.22.183:22-68.220.241.50:33174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:38:52.771536 kubelet[2924]: E0114 01:38:52.771247 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0"
Jan 14 01:38:52.880000 audit[5504]: USER_ACCT pid=5504 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:52.881831 sshd[5504]: Accepted publickey for core from 68.220.241.50 port 33174 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:38:52.889083 kernel: audit: type=1101 audit(1768354732.880:754): pid=5504 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:52.888000 audit[5504]: CRED_ACQ pid=5504 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:52.890804 sshd-session[5504]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:38:52.897889 kernel: audit: type=1103 audit(1768354732.888:755): pid=5504 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:52.888000 audit[5504]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff3564cbf0 a2=3 a3=0 items=0 ppid=1 pid=5504 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:38:52.907200 kernel: audit: type=1006 audit(1768354732.888:756): pid=5504 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1
Jan 14 01:38:52.907375 kernel: audit: type=1300 audit(1768354732.888:756): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff3564cbf0 a2=3 a3=0 items=0 ppid=1 pid=5504 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:38:52.888000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:38:52.913924 kernel: audit: type=1327 audit(1768354732.888:756): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:38:52.914419 systemd-logind[1652]: New session 13 of user core.
Jan 14 01:38:52.920041 systemd[1]: Started session-13.scope - Session 13 of User core.
Jan 14 01:38:52.921000 audit[5504]: USER_START pid=5504 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:52.930876 kernel: audit: type=1105 audit(1768354732.921:757): pid=5504 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:52.923000 audit[5508]: CRED_ACQ pid=5508 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:52.934972 kernel: audit: type=1103 audit(1768354732.923:758): pid=5508 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:53.231454 sshd[5508]: Connection closed by 68.220.241.50 port 33174
Jan 14 01:38:53.233432 sshd-session[5504]: pam_unix(sshd:session): session closed for user core
Jan 14 01:38:53.242883 kernel: audit: type=1106 audit(1768354733.233:759): pid=5504 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:53.233000 audit[5504]: USER_END pid=5504 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:53.238289 systemd[1]: sshd@11-10.0.22.183:22-68.220.241.50:33174.service: Deactivated successfully.
Jan 14 01:38:53.242056 systemd[1]: session-13.scope: Deactivated successfully.
Jan 14 01:38:53.246996 systemd-logind[1652]: Session 13 logged out. Waiting for processes to exit.
Jan 14 01:38:53.247705 systemd-logind[1652]: Removed session 13.
Jan 14 01:38:53.234000 audit[5504]: CRED_DISP pid=5504 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:53.237000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.22.183:22-68.220.241.50:33174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:38:53.253968 kernel: audit: type=1104 audit(1768354733.234:760): pid=5504 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:53.340000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.22.183:22-68.220.241.50:55408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:38:53.341197 systemd[1]: Started sshd@12-10.0.22.183:22-68.220.241.50:55408.service - OpenSSH per-connection server daemon (68.220.241.50:55408).
Jan 14 01:38:53.769484 kubelet[2924]: E0114 01:38:53.769453 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2"
Jan 14 01:38:53.881576 sshd[5522]: Accepted publickey for core from 68.220.241.50 port 55408 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:38:53.880000 audit[5522]: USER_ACCT pid=5522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:53.881000 audit[5522]: CRED_ACQ pid=5522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:53.882000 audit[5522]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff81666710 a2=3 a3=0 items=0 ppid=1 pid=5522 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:38:53.882000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:38:53.883778 sshd-session[5522]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:38:53.889953 systemd-logind[1652]: New session 14 of user core.
Jan 14 01:38:53.894334 systemd[1]: Started session-14.scope - Session 14 of User core.
Jan 14 01:38:53.896000 audit[5522]: USER_START pid=5522 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:53.899000 audit[5528]: CRED_ACQ pid=5528 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:54.278578 sshd[5528]: Connection closed by 68.220.241.50 port 55408
Jan 14 01:38:54.278924 sshd-session[5522]: pam_unix(sshd:session): session closed for user core
Jan 14 01:38:54.279000 audit[5522]: USER_END pid=5522 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:54.279000 audit[5522]: CRED_DISP pid=5522 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:54.282000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.22.183:22-68.220.241.50:55408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:38:54.283146 systemd[1]: sshd@12-10.0.22.183:22-68.220.241.50:55408.service: Deactivated successfully.
Jan 14 01:38:54.285402 systemd[1]: session-14.scope: Deactivated successfully.
Jan 14 01:38:54.286781 systemd-logind[1652]: Session 14 logged out. Waiting for processes to exit.
Jan 14 01:38:54.288450 systemd-logind[1652]: Removed session 14.
Jan 14 01:38:54.387000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.22.183:22-68.220.241.50:55420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:38:54.388072 systemd[1]: Started sshd@13-10.0.22.183:22-68.220.241.50:55420.service - OpenSSH per-connection server daemon (68.220.241.50:55420).
Jan 14 01:38:54.772141 kubelet[2924]: E0114 01:38:54.771970 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2"
Jan 14 01:38:54.930832 sshd[5538]: Accepted publickey for core from 68.220.241.50 port 55420 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:38:54.929000 audit[5538]: USER_ACCT pid=5538 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:54.931000 audit[5538]: CRED_ACQ pid=5538 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:54.932000 audit[5538]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc887aae80 a2=3 a3=0 items=0 ppid=1 pid=5538 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:38:54.932000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:38:54.933803 sshd-session[5538]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:38:54.944172 systemd-logind[1652]: New session 15 of user core.
Jan 14 01:38:54.948253 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 14 01:38:54.952000 audit[5538]: USER_START pid=5538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:54.955000 audit[5542]: CRED_ACQ pid=5542 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:55.301320 sshd[5542]: Connection closed by 68.220.241.50 port 55420
Jan 14 01:38:55.303289 sshd-session[5538]: pam_unix(sshd:session): session closed for user core
Jan 14 01:38:55.304000 audit[5538]: USER_END pid=5538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:55.304000 audit[5538]: CRED_DISP pid=5538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:38:55.307601 systemd[1]: sshd@13-10.0.22.183:22-68.220.241.50:55420.service: Deactivated successfully.
Jan 14 01:38:55.307000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.22.183:22-68.220.241.50:55420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:38:55.310606 systemd[1]: session-15.scope: Deactivated successfully.
Jan 14 01:38:55.312554 systemd-logind[1652]: Session 15 logged out. Waiting for processes to exit.
Jan 14 01:38:55.315152 systemd-logind[1652]: Removed session 15.
Jan 14 01:38:57.772209 kubelet[2924]: E0114 01:38:57.771590 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e"
Jan 14 01:38:58.769603 kubelet[2924]: E0114 01:38:58.769409 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0"
Jan 14 01:39:00.419666 kernel: kauditd_printk_skb: 23 callbacks suppressed
Jan 14 01:39:00.419801 kernel: audit: type=1130 audit(1768354740.414:780): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.22.183:22-68.220.241.50:55434 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:00.414000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.22.183:22-68.220.241.50:55434 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:00.415291 systemd[1]: Started sshd@14-10.0.22.183:22-68.220.241.50:55434.service - OpenSSH per-connection server daemon (68.220.241.50:55434).
Jan 14 01:39:00.983000 audit[5577]: USER_ACCT pid=5577 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:00.984762 sshd[5577]: Accepted publickey for core from 68.220.241.50 port 55434 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:39:00.986976 sshd-session[5577]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:39:00.993972 kernel: audit: type=1101 audit(1768354740.983:781): pid=5577 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:00.983000 audit[5577]: CRED_ACQ pid=5577 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.000874 systemd-logind[1652]: New session 16 of user core.
Jan 14 01:39:01.004974 kernel: audit: type=1103 audit(1768354740.983:782): pid=5577 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.005144 kernel: audit: type=1006 audit(1768354740.983:783): pid=5577 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1
Jan 14 01:39:00.983000 audit[5577]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea04cb340 a2=3 a3=0 items=0 ppid=1 pid=5577 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:01.011714 kernel: audit: type=1300 audit(1768354740.983:783): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea04cb340 a2=3 a3=0 items=0 ppid=1 pid=5577 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:01.012060 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 14 01:39:00.983000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:01.018885 kernel: audit: type=1327 audit(1768354740.983:783): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:01.017000 audit[5577]: USER_START pid=5577 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.022686 kernel: audit: type=1105 audit(1768354741.017:784): pid=5577 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.019000 audit[5581]: CRED_ACQ pid=5581 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.028901 kernel: audit: type=1103 audit(1768354741.019:785): pid=5581 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.368893 sshd[5581]: Connection closed by 68.220.241.50 port 55434
Jan 14 01:39:01.371998 sshd-session[5577]: pam_unix(sshd:session): session closed for user core
Jan 14 01:39:01.373000 audit[5577]: USER_END pid=5577 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.376774 systemd-logind[1652]: Session 16 logged out. Waiting for processes to exit.
Jan 14 01:39:01.373000 audit[5577]: CRED_DISP pid=5577 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.379177 systemd[1]: sshd@14-10.0.22.183:22-68.220.241.50:55434.service: Deactivated successfully.
Jan 14 01:39:01.380335 kernel: audit: type=1106 audit(1768354741.373:786): pid=5577 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.380388 kernel: audit: type=1104 audit(1768354741.373:787): pid=5577 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:01.382142 systemd[1]: session-16.scope: Deactivated successfully.
Jan 14 01:39:01.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.22.183:22-68.220.241.50:55434 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:01.386541 systemd-logind[1652]: Removed session 16.
Jan 14 01:39:02.773375 kubelet[2924]: E0114 01:39:02.772889 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2"
Jan 14 01:39:05.769385 kubelet[2924]: E0114 01:39:05.768751 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0"
Jan 14 01:39:06.484091 systemd[1]: Started sshd@15-10.0.22.183:22-68.220.241.50:38280.service - OpenSSH per-connection server daemon (68.220.241.50:38280).
Jan 14 01:39:06.483000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.22.183:22-68.220.241.50:38280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:06.488009 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 14 01:39:06.488060 kernel: audit: type=1130 audit(1768354746.483:789): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.22.183:22-68.220.241.50:38280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:07.030000 audit[5602]: USER_ACCT pid=5602 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.033953 sshd[5602]: Accepted publickey for core from 68.220.241.50 port 38280 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:39:07.037468 kernel: audit: type=1101 audit(1768354747.030:790): pid=5602 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.036000 audit[5602]: CRED_ACQ pid=5602 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.037893 sshd-session[5602]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:39:07.042902 kernel: audit: type=1103 audit(1768354747.036:791): pid=5602 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.042947 kernel: audit: type=1006 audit(1768354747.036:792): pid=5602 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1
Jan 14 01:39:07.045023 systemd-logind[1652]: New session 17 of user core.
Jan 14 01:39:07.036000 audit[5602]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe0757bc80 a2=3 a3=0 items=0 ppid=1 pid=5602 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:07.049910 kernel: audit: type=1300 audit(1768354747.036:792): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe0757bc80 a2=3 a3=0 items=0 ppid=1 pid=5602 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:07.049977 kernel: audit: type=1327 audit(1768354747.036:792): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:07.036000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:07.051070 systemd[1]: Started session-17.scope - Session 17 of User core.
Jan 14 01:39:07.053000 audit[5602]: USER_START pid=5602 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.055000 audit[5606]: CRED_ACQ pid=5606 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.060696 kernel: audit: type=1105 audit(1768354747.053:793): pid=5602 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.060763 kernel: audit: type=1103 audit(1768354747.055:794): pid=5606 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.400055 sshd[5606]: Connection closed by 68.220.241.50 port 38280
Jan 14 01:39:07.400537 sshd-session[5602]: pam_unix(sshd:session): session closed for user core
Jan 14 01:39:07.401000 audit[5602]: USER_END pid=5602 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.405194 systemd[1]: sshd@15-10.0.22.183:22-68.220.241.50:38280.service: Deactivated successfully.
Jan 14 01:39:07.407816 systemd[1]: session-17.scope: Deactivated successfully.
Jan 14 01:39:07.407887 kernel: audit: type=1106 audit(1768354747.401:795): pid=5602 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.401000 audit[5602]: CRED_DISP pid=5602 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.412759 systemd-logind[1652]: Session 17 logged out. Waiting for processes to exit.
Jan 14 01:39:07.413730 systemd-logind[1652]: Removed session 17.
Jan 14 01:39:07.414139 kernel: audit: type=1104 audit(1768354747.401:796): pid=5602 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:07.404000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.22.183:22-68.220.241.50:38280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:08.772320 kubelet[2924]: E0114 01:39:08.772244 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2"
Jan 14 01:39:08.776894 kubelet[2924]: E0114 01:39:08.776751 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e"
Jan 14 01:39:09.769823 kubelet[2924]: E0114 01:39:09.769701 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image:
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:39:09.770173 kubelet[2924]: E0114 01:39:09.769770 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:39:12.511041 systemd[1]: Started sshd@16-10.0.22.183:22-68.220.241.50:59620.service - OpenSSH per-connection server daemon (68.220.241.50:59620). Jan 14 01:39:12.510000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.22.183:22-68.220.241.50:59620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:12.512159 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:39:12.512202 kernel: audit: type=1130 audit(1768354752.510:798): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.22.183:22-68.220.241.50:59620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:39:13.053000 audit[5619]: USER_ACCT pid=5619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.054588 sshd[5619]: Accepted publickey for core from 68.220.241.50 port 59620 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:39:13.058232 sshd-session[5619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:39:13.058959 kernel: audit: type=1101 audit(1768354753.053:799): pid=5619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.056000 audit[5619]: CRED_ACQ pid=5619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.063591 systemd-logind[1652]: New session 18 of user core. 
Jan 14 01:39:13.065872 kernel: audit: type=1103 audit(1768354753.056:800): pid=5619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.056000 audit[5619]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3d271270 a2=3 a3=0 items=0 ppid=1 pid=5619 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:13.071351 kernel: audit: type=1006 audit(1768354753.056:801): pid=5619 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 14 01:39:13.071407 kernel: audit: type=1300 audit(1768354753.056:801): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3d271270 a2=3 a3=0 items=0 ppid=1 pid=5619 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:13.072035 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 14 01:39:13.056000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:13.075000 audit[5619]: USER_START pid=5619 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.078421 kernel: audit: type=1327 audit(1768354753.056:801): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:13.078478 kernel: audit: type=1105 audit(1768354753.075:802): pid=5619 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.081000 audit[5623]: CRED_ACQ pid=5623 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.085870 kernel: audit: type=1103 audit(1768354753.081:803): pid=5623 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.599186 sshd[5623]: Connection closed by 68.220.241.50 port 59620 Jan 14 01:39:13.601363 sshd-session[5619]: pam_unix(sshd:session): session closed for user core Jan 14 01:39:13.602000 audit[5619]: USER_END pid=5619 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.609838 systemd[1]: sshd@16-10.0.22.183:22-68.220.241.50:59620.service: Deactivated successfully. Jan 14 01:39:13.614213 systemd[1]: session-18.scope: Deactivated successfully. Jan 14 01:39:13.614866 kernel: audit: type=1106 audit(1768354753.602:804): pid=5619 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.616775 systemd-logind[1652]: Session 18 logged out. Waiting for processes to exit. Jan 14 01:39:13.618721 systemd-logind[1652]: Removed session 18. Jan 14 01:39:13.603000 audit[5619]: CRED_DISP pid=5619 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.609000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.22.183:22-68.220.241.50:59620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:39:13.625868 kernel: audit: type=1104 audit(1768354753.603:805): pid=5619 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:13.770731 kubelet[2924]: E0114 01:39:13.770641 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:39:16.772899 kubelet[2924]: E0114 01:39:16.772515 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:39:18.712337 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:39:18.712424 kernel: audit: type=1130 
audit(1768354758.709:807): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.22.183:22-68.220.241.50:59636 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:18.709000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.22.183:22-68.220.241.50:59636 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:18.710429 systemd[1]: Started sshd@17-10.0.22.183:22-68.220.241.50:59636.service - OpenSSH per-connection server daemon (68.220.241.50:59636). Jan 14 01:39:19.267000 audit[5639]: USER_ACCT pid=5639 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.273302 sshd[5639]: Accepted publickey for core from 68.220.241.50 port 59636 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:39:19.273924 kernel: audit: type=1101 audit(1768354759.267:808): pid=5639 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.275368 sshd-session[5639]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:39:19.273000 audit[5639]: CRED_ACQ pid=5639 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.279872 kernel: audit: type=1103 audit(1768354759.273:809): pid=5639 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.273000 audit[5639]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe13529870 a2=3 a3=0 items=0 ppid=1 pid=5639 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:19.286385 kernel: audit: type=1006 audit(1768354759.273:810): pid=5639 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 14 01:39:19.286422 kernel: audit: type=1300 audit(1768354759.273:810): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe13529870 a2=3 a3=0 items=0 ppid=1 pid=5639 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:19.287785 systemd-logind[1652]: New session 19 of user core. Jan 14 01:39:19.288985 kernel: audit: type=1327 audit(1768354759.273:810): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:19.273000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:19.294039 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 14 01:39:19.298000 audit[5639]: USER_START pid=5639 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.304916 kernel: audit: type=1105 audit(1768354759.298:811): pid=5639 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.304000 audit[5643]: CRED_ACQ pid=5643 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.310866 kernel: audit: type=1103 audit(1768354759.304:812): pid=5643 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.635693 sshd[5643]: Connection closed by 68.220.241.50 port 59636 Jan 14 01:39:19.638326 sshd-session[5639]: pam_unix(sshd:session): session closed for user core Jan 14 01:39:19.639000 audit[5639]: USER_END pid=5639 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.644337 systemd[1]: 
sshd@17-10.0.22.183:22-68.220.241.50:59636.service: Deactivated successfully. Jan 14 01:39:19.639000 audit[5639]: CRED_DISP pid=5639 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.646637 kernel: audit: type=1106 audit(1768354759.639:813): pid=5639 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.646686 kernel: audit: type=1104 audit(1768354759.639:814): pid=5639 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:19.648256 systemd[1]: session-19.scope: Deactivated successfully. Jan 14 01:39:19.642000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.22.183:22-68.220.241.50:59636 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:19.653083 systemd-logind[1652]: Session 19 logged out. Waiting for processes to exit. Jan 14 01:39:19.653820 systemd-logind[1652]: Removed session 19. 
Jan 14 01:39:19.769781 kubelet[2924]: E0114 01:39:19.769731 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:39:20.769446 kubelet[2924]: E0114 01:39:20.769397 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:39:20.769855 kubelet[2924]: E0114 01:39:20.769730 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:39:21.769489 kubelet[2924]: E0114 01:39:21.769235 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:39:24.745272 systemd[1]: Started sshd@18-10.0.22.183:22-68.220.241.50:54194.service - OpenSSH per-connection server daemon (68.220.241.50:54194). Jan 14 01:39:24.744000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.22.183:22-68.220.241.50:54194 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:24.747027 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:39:24.747081 kernel: audit: type=1130 audit(1768354764.744:816): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.22.183:22-68.220.241.50:54194 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:39:24.770160 kubelet[2924]: E0114 01:39:24.769838 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:39:25.282176 sshd[5657]: Accepted publickey for core from 68.220.241.50 port 54194 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:39:25.281000 audit[5657]: USER_ACCT pid=5657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.288855 kernel: audit: type=1101 audit(1768354765.281:817): pid=5657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.288578 sshd-session[5657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:39:25.286000 audit[5657]: CRED_ACQ pid=5657 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.295962 kernel: audit: type=1103 audit(1768354765.286:818): pid=5657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.301923 kernel: audit: type=1006 audit(1768354765.287:819): pid=5657 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 14 01:39:25.301986 kernel: audit: type=1300 audit(1768354765.287:819): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc5b823f20 a2=3 a3=0 items=0 ppid=1 pid=5657 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:25.287000 audit[5657]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc5b823f20 a2=3 a3=0 items=0 ppid=1 pid=5657 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:25.299577 systemd-logind[1652]: New session 20 of user core. Jan 14 01:39:25.287000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:25.307182 kernel: audit: type=1327 audit(1768354765.287:819): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:25.308245 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 14 01:39:25.312000 audit[5657]: USER_START pid=5657 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.318869 kernel: audit: type=1105 audit(1768354765.312:820): pid=5657 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.315000 audit[5661]: CRED_ACQ pid=5661 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.323882 kernel: audit: type=1103 audit(1768354765.315:821): pid=5661 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.640155 sshd[5661]: Connection closed by 68.220.241.50 port 54194 Jan 14 01:39:25.640018 sshd-session[5657]: pam_unix(sshd:session): session closed for user core Jan 14 01:39:25.640000 audit[5657]: USER_END pid=5657 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.647879 kernel: audit: type=1106 
audit(1768354765.640:822): pid=5657 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.649098 systemd[1]: sshd@18-10.0.22.183:22-68.220.241.50:54194.service: Deactivated successfully. Jan 14 01:39:25.651402 systemd[1]: session-20.scope: Deactivated successfully. Jan 14 01:39:25.640000 audit[5657]: CRED_DISP pid=5657 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.658468 kernel: audit: type=1104 audit(1768354765.640:823): pid=5657 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:25.658644 systemd-logind[1652]: Session 20 logged out. Waiting for processes to exit. Jan 14 01:39:25.659703 systemd-logind[1652]: Removed session 20. Jan 14 01:39:25.648000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.22.183:22-68.220.241.50:54194 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:39:27.769559 kubelet[2924]: E0114 01:39:27.769516 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:39:30.761017 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:39:30.761180 kernel: audit: type=1130 audit(1768354770.757:825): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.22.183:22-68.220.241.50:54196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:30.757000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.22.183:22-68.220.241.50:54196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:30.757928 systemd[1]: Started sshd@19-10.0.22.183:22-68.220.241.50:54196.service - OpenSSH per-connection server daemon (68.220.241.50:54196). 
Jan 14 01:39:30.771889 kubelet[2924]: E0114 01:39:30.771712 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:39:31.295000 audit[5696]: USER_ACCT pid=5696 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.300035 sshd[5696]: Accepted publickey for core from 68.220.241.50 port 54196 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:39:31.300000 audit[5696]: CRED_ACQ pid=5696 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.302268 sshd-session[5696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:39:31.302610 kernel: audit: type=1101 audit(1768354771.295:826): pid=5696 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting 
grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.302652 kernel: audit: type=1103 audit(1768354771.300:827): pid=5696 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.307860 kernel: audit: type=1006 audit(1768354771.300:828): pid=5696 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 14 01:39:31.300000 audit[5696]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe0df34550 a2=3 a3=0 items=0 ppid=1 pid=5696 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:31.312782 systemd-logind[1652]: New session 21 of user core. Jan 14 01:39:31.300000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:31.317912 kernel: audit: type=1300 audit(1768354771.300:828): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe0df34550 a2=3 a3=0 items=0 ppid=1 pid=5696 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:31.317950 kernel: audit: type=1327 audit(1768354771.300:828): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:31.319068 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 14 01:39:31.322000 audit[5696]: USER_START pid=5696 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.327892 kernel: audit: type=1105 audit(1768354771.322:829): pid=5696 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.324000 audit[5700]: CRED_ACQ pid=5700 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.333864 kernel: audit: type=1103 audit(1768354771.324:830): pid=5700 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.664374 sshd[5700]: Connection closed by 68.220.241.50 port 54196 Jan 14 01:39:31.664888 sshd-session[5696]: pam_unix(sshd:session): session closed for user core Jan 14 01:39:31.665000 audit[5696]: USER_END pid=5696 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.668605 systemd[1]: 
sshd@19-10.0.22.183:22-68.220.241.50:54196.service: Deactivated successfully. Jan 14 01:39:31.670496 systemd[1]: session-21.scope: Deactivated successfully. Jan 14 01:39:31.672884 kernel: audit: type=1106 audit(1768354771.665:831): pid=5696 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.673013 kernel: audit: type=1104 audit(1768354771.665:832): pid=5696 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.665000 audit[5696]: CRED_DISP pid=5696 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:31.676436 systemd-logind[1652]: Session 21 logged out. Waiting for processes to exit. Jan 14 01:39:31.666000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.22.183:22-68.220.241.50:54196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:31.678224 systemd-logind[1652]: Removed session 21. Jan 14 01:39:31.772028 systemd[1]: Started sshd@20-10.0.22.183:22-68.220.241.50:54204.service - OpenSSH per-connection server daemon (68.220.241.50:54204). Jan 14 01:39:31.771000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.22.183:22-68.220.241.50:54204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:39:32.321000 audit[5712]: USER_ACCT pid=5712 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:32.322275 sshd[5712]: Accepted publickey for core from 68.220.241.50 port 54204 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:39:32.322000 audit[5712]: CRED_ACQ pid=5712 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:32.322000 audit[5712]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff02f96b60 a2=3 a3=0 items=0 ppid=1 pid=5712 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:32.322000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:32.324636 sshd-session[5712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:39:32.332269 systemd-logind[1652]: New session 22 of user core. Jan 14 01:39:32.336043 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 14 01:39:32.338000 audit[5712]: USER_START pid=5712 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:32.339000 audit[5716]: CRED_ACQ pid=5716 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:32.770699 kubelet[2924]: E0114 01:39:32.770643 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:39:33.016519 sshd[5716]: Connection closed by 68.220.241.50 port 54204 Jan 14 01:39:33.017128 sshd-session[5712]: pam_unix(sshd:session): session closed for user core Jan 14 01:39:33.018000 audit[5712]: USER_END pid=5712 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:33.018000 audit[5712]: CRED_DISP pid=5712 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock 
acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:33.021735 systemd-logind[1652]: Session 22 logged out. Waiting for processes to exit. Jan 14 01:39:33.022034 systemd[1]: sshd@20-10.0.22.183:22-68.220.241.50:54204.service: Deactivated successfully. Jan 14 01:39:33.021000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.22.183:22-68.220.241.50:54204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:33.023746 systemd[1]: session-22.scope: Deactivated successfully. Jan 14 01:39:33.025346 systemd-logind[1652]: Removed session 22. Jan 14 01:39:33.129271 systemd[1]: Started sshd@21-10.0.22.183:22-68.220.241.50:39354.service - OpenSSH per-connection server daemon (68.220.241.50:39354). Jan 14 01:39:33.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.22.183:22-68.220.241.50:39354 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:39:33.671000 audit[5726]: USER_ACCT pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:33.672261 sshd[5726]: Accepted publickey for core from 68.220.241.50 port 39354 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:39:33.672000 audit[5726]: CRED_ACQ pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:33.673000 audit[5726]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd3c589a20 a2=3 a3=0 items=0 ppid=1 pid=5726 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:33.673000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:33.674611 sshd-session[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:39:33.680900 systemd-logind[1652]: New session 23 of user core. Jan 14 01:39:33.684033 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 14 01:39:33.687000 audit[5726]: USER_START pid=5726 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:33.690000 audit[5730]: CRED_ACQ pid=5730 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:34.501000 audit[5758]: NETFILTER_CFG table=filter:135 family=2 entries=26 op=nft_register_rule pid=5758 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:39:34.501000 audit[5758]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffe3bc96540 a2=0 a3=7ffe3bc9652c items=0 ppid=3028 pid=5758 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:34.501000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:39:34.508000 audit[5758]: NETFILTER_CFG table=nat:136 family=2 entries=20 op=nft_register_rule pid=5758 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:39:34.508000 audit[5758]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffe3bc96540 a2=0 a3=0 items=0 ppid=3028 pid=5758 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:34.508000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:39:34.537000 audit[5760]: NETFILTER_CFG table=filter:137 family=2 entries=38 op=nft_register_rule pid=5760 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:39:34.537000 audit[5760]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffd12ee4bf0 a2=0 a3=7ffd12ee4bdc items=0 ppid=3028 pid=5760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:34.537000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:39:34.542000 audit[5760]: NETFILTER_CFG table=nat:138 family=2 entries=20 op=nft_register_rule pid=5760 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:39:34.542000 audit[5760]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffd12ee4bf0 a2=0 a3=0 items=0 ppid=3028 pid=5760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:34.542000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:39:34.595057 sshd[5730]: Connection closed by 68.220.241.50 port 39354 Jan 14 01:39:34.594943 sshd-session[5726]: pam_unix(sshd:session): session closed for user core Jan 14 01:39:34.597000 audit[5726]: USER_END pid=5726 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 
addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:34.597000 audit[5726]: CRED_DISP pid=5726 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:34.603214 systemd[1]: sshd@21-10.0.22.183:22-68.220.241.50:39354.service: Deactivated successfully. Jan 14 01:39:34.602000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.22.183:22-68.220.241.50:39354 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:34.605412 systemd[1]: session-23.scope: Deactivated successfully. Jan 14 01:39:34.606417 systemd-logind[1652]: Session 23 logged out. Waiting for processes to exit. Jan 14 01:39:34.608198 systemd-logind[1652]: Removed session 23. Jan 14 01:39:34.702000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.22.183:22-68.220.241.50:39360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:34.703003 systemd[1]: Started sshd@22-10.0.22.183:22-68.220.241.50:39360.service - OpenSSH per-connection server daemon (68.220.241.50:39360). 
Jan 14 01:39:35.248000 audit[5765]: USER_ACCT pid=5765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:35.250294 sshd[5765]: Accepted publickey for core from 68.220.241.50 port 39360 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:39:35.250000 audit[5765]: CRED_ACQ pid=5765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:35.250000 audit[5765]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe14d702d0 a2=3 a3=0 items=0 ppid=1 pid=5765 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:35.250000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:35.252170 sshd-session[5765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:39:35.258193 systemd-logind[1652]: New session 24 of user core. Jan 14 01:39:35.266019 systemd[1]: Started session-24.scope - Session 24 of User core. 
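The audit `PROCTITLE` records above (type=1327) store the process's argv as a hex string with NUL bytes separating the arguments. A small decoder using only the standard library — `decode_proctitle` is a hypothetical helper name, but the two hex strings are taken verbatim from the records in this log:

```python
def decode_proctitle(hex_title: str) -> list[str]:
    """Turn an audit PROCTITLE hex string back into an argv list."""
    return bytes.fromhex(hex_title).decode("utf-8", errors="replace").split("\x00")

# The sshd record seen on each login above:
sshd_argv = decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D")
# -> ["sshd-session: core [priv]"]

# The iptables-restore record from the NETFILTER_CFG events:
ipt_argv = decode_proctitle(
    "69707461626C65732D726573746F7265002D770035002D5700313030303030"
    "002D2D6E6F666C757368002D2D636F756E74657273"
)
# -> ["iptables-restore", "-w", "5", "-W", "100000", "--noflush", "--counters"]
```

Decoding these confirms that the repeated netfilter churn is kube-proxy restoring rules via `iptables-restore --noflush --counters`.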
Jan 14 01:39:35.268000 audit[5765]: USER_START pid=5765 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:35.270000 audit[5769]: CRED_ACQ pid=5769 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:35.751910 sshd[5769]: Connection closed by 68.220.241.50 port 39360 Jan 14 01:39:35.752465 sshd-session[5765]: pam_unix(sshd:session): session closed for user core Jan 14 01:39:35.754000 audit[5765]: USER_END pid=5765 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:35.754000 audit[5765]: CRED_DISP pid=5765 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:35.757613 systemd-logind[1652]: Session 24 logged out. Waiting for processes to exit. Jan 14 01:39:35.757000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.22.183:22-68.220.241.50:39360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:35.758403 systemd[1]: sshd@22-10.0.22.183:22-68.220.241.50:39360.service: Deactivated successfully. 
Jan 14 01:39:35.761611 systemd[1]: session-24.scope: Deactivated successfully. Jan 14 01:39:35.763249 systemd-logind[1652]: Removed session 24. Jan 14 01:39:35.769246 kubelet[2924]: E0114 01:39:35.769143 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:39:35.769910 kubelet[2924]: E0114 01:39:35.769662 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:39:35.864512 kernel: kauditd_printk_skb: 46 callbacks suppressed Jan 14 01:39:35.864602 kernel: audit: type=1130 audit(1768354775.857:865): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.22.183:22-68.220.241.50:39368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:35.857000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.22.183:22-68.220.241.50:39368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:39:35.858409 systemd[1]: Started sshd@23-10.0.22.183:22-68.220.241.50:39368.service - OpenSSH per-connection server daemon (68.220.241.50:39368). Jan 14 01:39:36.399000 audit[5781]: USER_ACCT pid=5781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.401013 sshd[5781]: Accepted publickey for core from 68.220.241.50 port 39368 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:39:36.402000 audit[5781]: CRED_ACQ pid=5781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.404816 sshd-session[5781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:39:36.405989 kernel: audit: type=1101 audit(1768354776.399:866): pid=5781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.406041 kernel: audit: type=1103 audit(1768354776.402:867): pid=5781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.409523 kernel: audit: type=1006 audit(1768354776.403:868): pid=5781 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 14 01:39:36.410970 kernel: audit: type=1300 audit(1768354776.403:868): 
arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff4d618170 a2=3 a3=0 items=0 ppid=1 pid=5781 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:36.403000 audit[5781]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff4d618170 a2=3 a3=0 items=0 ppid=1 pid=5781 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:39:36.403000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:36.415870 kernel: audit: type=1327 audit(1768354776.403:868): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:39:36.415468 systemd-logind[1652]: New session 25 of user core. Jan 14 01:39:36.423009 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 14 01:39:36.428000 audit[5781]: USER_START pid=5781 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.434864 kernel: audit: type=1105 audit(1768354776.428:869): pid=5781 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.434000 audit[5785]: CRED_ACQ pid=5785 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.439249 kernel: audit: type=1103 audit(1768354776.434:870): pid=5785 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.793889 sshd[5785]: Connection closed by 68.220.241.50 port 39368 Jan 14 01:39:36.795068 sshd-session[5781]: pam_unix(sshd:session): session closed for user core Jan 14 01:39:36.795000 audit[5781]: USER_END pid=5781 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.803982 kernel: audit: type=1106 
audit(1768354776.795:871): pid=5781 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.804930 systemd[1]: sshd@23-10.0.22.183:22-68.220.241.50:39368.service: Deactivated successfully. Jan 14 01:39:36.807929 systemd[1]: session-25.scope: Deactivated successfully. Jan 14 01:39:36.795000 audit[5781]: CRED_DISP pid=5781 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.811885 kernel: audit: type=1104 audit(1768354776.795:872): pid=5781 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:39:36.804000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.22.183:22-68.220.241.50:39368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:39:36.813077 systemd-logind[1652]: Session 25 logged out. Waiting for processes to exit. Jan 14 01:39:36.815302 systemd-logind[1652]: Removed session 25. 
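Each audit record carries two clocks: the syslog wall-clock prefix and the epoch seconds inside `audit(SECONDS.MILLIS:SERIAL)`. A quick sketch checking that they agree, assuming UTC (the kernel banner in this boot reports a `-00` timezone); `audit_time` is an illustrative helper, not an audit-subsystem API:

```python
from datetime import datetime, timezone

def audit_time(stamp: str) -> datetime:
    """Parse 'audit(1768354765.640:822)' into a timezone-aware UTC datetime."""
    seconds = float(stamp.split("(")[1].split(":")[0])
    return datetime.fromtimestamp(seconds, tz=timezone.utc)

t = audit_time("audit(1768354765.640:822)")
# Agrees with the 'Jan 14 01:39:25' wall-clock prefix on that same record.
assert t.strftime("%b %d %H:%M:%S") == "Jan 14 01:39:25"
```

The serial after the colon (`:822`) is not a timestamp; it groups the SYSCALL/PROCTITLE/PATH lines that belong to one audit event.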
Jan 14 01:39:39.769675 kubelet[2924]: E0114 01:39:39.769630 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2"
Jan 14 01:39:39.863000 audit[5796]: NETFILTER_CFG table=filter:139 family=2 entries=26 op=nft_register_rule pid=5796 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 14 01:39:39.863000 audit[5796]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe92e00c00 a2=0 a3=7ffe92e00bec items=0 ppid=3028 pid=5796 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:39.863000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Jan 14 01:39:39.872000 audit[5796]: NETFILTER_CFG table=nat:140 family=2 entries=104 op=nft_register_chain pid=5796 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 14 01:39:39.872000 audit[5796]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffe92e00c00 a2=0 a3=7ffe92e00bec items=0 ppid=3028 pid=5796 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:39.872000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Jan 14 01:39:40.770076 kubelet[2924]: E0114 01:39:40.769865 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0"
Jan 14 01:39:41.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.22.183:22-68.220.241.50:39384 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:41.907337 systemd[1]: Started sshd@24-10.0.22.183:22-68.220.241.50:39384.service - OpenSSH per-connection server daemon (68.220.241.50:39384).
Jan 14 01:39:41.909238 kernel: kauditd_printk_skb: 7 callbacks suppressed
Jan 14 01:39:41.909272 kernel: audit: type=1130 audit(1768354781.906:876): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.22.183:22-68.220.241.50:39384 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:42.445000 audit[5798]: USER_ACCT pid=5798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.452579 sshd-session[5798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:39:42.453758 sshd[5798]: Accepted publickey for core from 68.220.241.50 port 39384 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:39:42.454630 kernel: audit: type=1101 audit(1768354782.445:877): pid=5798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.454728 kernel: audit: type=1103 audit(1768354782.450:878): pid=5798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.450000 audit[5798]: CRED_ACQ pid=5798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.460974 kernel: audit: type=1006 audit(1768354782.450:879): pid=5798 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1
Jan 14 01:39:42.450000 audit[5798]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc6005de40 a2=3 a3=0 items=0 ppid=1 pid=5798 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:42.467686 kernel: audit: type=1300 audit(1768354782.450:879): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc6005de40 a2=3 a3=0 items=0 ppid=1 pid=5798 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:42.470663 systemd-logind[1652]: New session 26 of user core.
Jan 14 01:39:42.450000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:42.477878 kernel: audit: type=1327 audit(1768354782.450:879): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:42.483165 systemd[1]: Started session-26.scope - Session 26 of User core.
Jan 14 01:39:42.488000 audit[5798]: USER_START pid=5798 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.496894 kernel: audit: type=1105 audit(1768354782.488:880): pid=5798 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.496000 audit[5802]: CRED_ACQ pid=5802 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.503875 kernel: audit: type=1103 audit(1768354782.496:881): pid=5802 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.807173 sshd[5802]: Connection closed by 68.220.241.50 port 39384
Jan 14 01:39:42.809522 sshd-session[5798]: pam_unix(sshd:session): session closed for user core
Jan 14 01:39:42.810000 audit[5798]: USER_END pid=5798 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.813820 systemd-logind[1652]: Session 26 logged out. Waiting for processes to exit.
Jan 14 01:39:42.814093 systemd[1]: sshd@24-10.0.22.183:22-68.220.241.50:39384.service: Deactivated successfully.
Jan 14 01:39:42.816864 kernel: audit: type=1106 audit(1768354782.810:882): pid=5798 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.817470 systemd[1]: session-26.scope: Deactivated successfully.
Jan 14 01:39:42.810000 audit[5798]: CRED_DISP pid=5798 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.820019 systemd-logind[1652]: Removed session 26.
Jan 14 01:39:42.823864 kernel: audit: type=1104 audit(1768354782.810:883): pid=5798 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:42.813000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.22.183:22-68.220.241.50:39384 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:44.770825 kubelet[2924]: E0114 01:39:44.770784 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e"
Jan 14 01:39:46.770208 kubelet[2924]: E0114 01:39:46.769941 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0"
Jan 14 01:39:47.769557 kubelet[2924]: E0114 01:39:47.769517 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2"
Jan 14 01:39:47.920786 systemd[1]: Started sshd@25-10.0.22.183:22-68.220.241.50:45210.service - OpenSSH per-connection server daemon (68.220.241.50:45210).
Jan 14 01:39:47.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.22.183:22-68.220.241.50:45210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:47.922558 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 14 01:39:47.922606 kernel: audit: type=1130 audit(1768354787.920:885): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.22.183:22-68.220.241.50:45210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:48.477000 audit[5816]: USER_ACCT pid=5816 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.479565 sshd[5816]: Accepted publickey for core from 68.220.241.50 port 45210 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:39:48.481296 sshd-session[5816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:39:48.479000 audit[5816]: CRED_ACQ pid=5816 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.484798 kernel: audit: type=1101 audit(1768354788.477:886): pid=5816 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.484917 kernel: audit: type=1103 audit(1768354788.479:887): pid=5816 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.487905 systemd-logind[1652]: New session 27 of user core.
Jan 14 01:39:48.488907 kernel: audit: type=1006 audit(1768354788.479:888): pid=5816 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1
Jan 14 01:39:48.479000 audit[5816]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe6e172db0 a2=3 a3=0 items=0 ppid=1 pid=5816 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:48.492054 kernel: audit: type=1300 audit(1768354788.479:888): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe6e172db0 a2=3 a3=0 items=0 ppid=1 pid=5816 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:48.479000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:48.495234 kernel: audit: type=1327 audit(1768354788.479:888): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:48.496097 systemd[1]: Started session-27.scope - Session 27 of User core.
Jan 14 01:39:48.498000 audit[5816]: USER_START pid=5816 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.501000 audit[5820]: CRED_ACQ pid=5820 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.506130 kernel: audit: type=1105 audit(1768354788.498:889): pid=5816 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.506241 kernel: audit: type=1103 audit(1768354788.501:890): pid=5820 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.866216 sshd[5820]: Connection closed by 68.220.241.50 port 45210
Jan 14 01:39:48.866759 sshd-session[5816]: pam_unix(sshd:session): session closed for user core
Jan 14 01:39:48.867000 audit[5816]: USER_END pid=5816 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.870723 systemd-logind[1652]: Session 27 logged out. Waiting for processes to exit.
Jan 14 01:39:48.871495 systemd[1]: sshd@25-10.0.22.183:22-68.220.241.50:45210.service: Deactivated successfully.
Jan 14 01:39:48.867000 audit[5816]: CRED_DISP pid=5816 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.874298 systemd[1]: session-27.scope: Deactivated successfully.
Jan 14 01:39:48.874628 kernel: audit: type=1106 audit(1768354788.867:891): pid=5816 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.874686 kernel: audit: type=1104 audit(1768354788.867:892): pid=5816 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:48.870000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.22.183:22-68.220.241.50:45210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:48.878161 systemd-logind[1652]: Removed session 27.
Jan 14 01:39:49.770557 kubelet[2924]: E0114 01:39:49.770289 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2"
Jan 14 01:39:51.769821 kubelet[2924]: E0114 01:39:51.769607 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0"
Jan 14 01:39:53.771214 kubelet[2924]: E0114 01:39:53.771159 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2"
Jan 14 01:39:53.982191 systemd[1]: Started sshd@26-10.0.22.183:22-68.220.241.50:39472.service - OpenSSH per-connection server daemon (68.220.241.50:39472).
Jan 14 01:39:53.990011 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 14 01:39:53.990044 kernel: audit: type=1130 audit(1768354793.981:894): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.22.183:22-68.220.241.50:39472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:53.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.22.183:22-68.220.241.50:39472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:54.543000 audit[5835]: USER_ACCT pid=5835 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.547109 sshd-session[5835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:39:54.550532 kernel: audit: type=1101 audit(1768354794.543:895): pid=5835 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.550562 sshd[5835]: Accepted publickey for core from 68.220.241.50 port 39472 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:39:54.555950 kernel: audit: type=1103 audit(1768354794.545:896): pid=5835 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.545000 audit[5835]: CRED_ACQ pid=5835 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.554620 systemd-logind[1652]: New session 28 of user core.
Jan 14 01:39:54.545000 audit[5835]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcb5e4bd00 a2=3 a3=0 items=0 ppid=1 pid=5835 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:54.561175 kernel: audit: type=1006 audit(1768354794.545:897): pid=5835 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1
Jan 14 01:39:54.561209 kernel: audit: type=1300 audit(1768354794.545:897): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcb5e4bd00 a2=3 a3=0 items=0 ppid=1 pid=5835 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:39:54.562023 systemd[1]: Started session-28.scope - Session 28 of User core.
Jan 14 01:39:54.564191 kernel: audit: type=1327 audit(1768354794.545:897): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:54.545000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:39:54.566000 audit[5835]: USER_START pid=5835 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.569000 audit[5839]: CRED_ACQ pid=5839 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.573887 kernel: audit: type=1105 audit(1768354794.566:898): pid=5835 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.573935 kernel: audit: type=1103 audit(1768354794.569:899): pid=5839 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.910022 sshd[5839]: Connection closed by 68.220.241.50 port 39472
Jan 14 01:39:54.910501 sshd-session[5835]: pam_unix(sshd:session): session closed for user core
Jan 14 01:39:54.911000 audit[5835]: USER_END pid=5835 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.915424 systemd[1]: sshd@26-10.0.22.183:22-68.220.241.50:39472.service: Deactivated successfully.
Jan 14 01:39:54.915619 systemd-logind[1652]: Session 28 logged out. Waiting for processes to exit.
Jan 14 01:39:54.917952 kernel: audit: type=1106 audit(1768354794.911:900): pid=5835 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.918644 systemd[1]: session-28.scope: Deactivated successfully.
Jan 14 01:39:54.911000 audit[5835]: CRED_DISP pid=5835 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.922810 systemd-logind[1652]: Removed session 28.
Jan 14 01:39:54.925868 kernel: audit: type=1104 audit(1768354794.911:901): pid=5835 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:39:54.913000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.22.183:22-68.220.241.50:39472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:39:55.770214 kubelet[2924]: E0114 01:39:55.769974 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e"
Jan 14 01:39:59.769913 kubelet[2924]: E0114 01:39:59.769598 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0"
Jan 14 01:39:59.770332 kubelet[2924]: E0114 01:39:59.770025 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2"
Jan 14 01:40:00.023307 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 14 01:40:00.023406 kernel: audit: type=1130 audit(1768354800.019:903): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.22.183:22-68.220.241.50:39474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:40:00.019000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.22.183:22-68.220.241.50:39474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:40:00.021194 systemd[1]: Started sshd@27-10.0.22.183:22-68.220.241.50:39474.service - OpenSSH per-connection server daemon (68.220.241.50:39474).
Jan 14 01:40:00.575000 audit[5874]: USER_ACCT pid=5874 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:40:00.577994 sshd[5874]: Accepted publickey for core from 68.220.241.50 port 39474 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs
Jan 14 01:40:00.581874 kernel: audit: type=1101 audit(1768354800.575:904): pid=5874 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:40:00.580000 audit[5874]: CRED_ACQ pid=5874 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:40:00.584546 sshd-session[5874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:40:00.587995 kernel: audit: type=1103 audit(1768354800.580:905): pid=5874 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 14 01:40:00.588049 kernel: audit: type=1006 audit(1768354800.580:906): pid=5874 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1
Jan 14 01:40:00.580000 audit[5874]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff3734a890 a2=3 a3=0 items=0 ppid=1 pid=5874 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:40:00.593801 kernel: audit: type=1300 audit(1768354800.580:906): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff3734a890 a2=3 a3=0 items=0 ppid=1 pid=5874 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:40:00.594999 kernel: audit: type=1327 audit(1768354800.580:906): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:40:00.580000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:40:00.594703 systemd-logind[1652]: New session 29 of user core.
Jan 14 01:40:00.599117 systemd[1]: Started session-29.scope - Session 29 of User core.
Jan 14 01:40:00.601000 audit[5874]: USER_START pid=5874 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:00.606000 audit[5878]: CRED_ACQ pid=5878 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:00.609938 kernel: audit: type=1105 audit(1768354800.601:907): pid=5874 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:00.610001 kernel: audit: type=1103 audit(1768354800.606:908): pid=5878 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:00.972385 sshd[5878]: Connection closed by 68.220.241.50 port 39474 Jan 14 01:40:00.974044 sshd-session[5874]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:00.974000 audit[5874]: USER_END pid=5874 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:00.978617 systemd[1]: 
sshd@27-10.0.22.183:22-68.220.241.50:39474.service: Deactivated successfully. Jan 14 01:40:00.980741 systemd[1]: session-29.scope: Deactivated successfully. Jan 14 01:40:00.982060 kernel: audit: type=1106 audit(1768354800.974:909): pid=5874 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:00.982563 systemd-logind[1652]: Session 29 logged out. Waiting for processes to exit. Jan 14 01:40:00.983881 systemd-logind[1652]: Removed session 29. Jan 14 01:40:00.974000 audit[5874]: CRED_DISP pid=5874 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:00.977000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.22.183:22-68.220.241.50:39474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:00.989225 kernel: audit: type=1104 audit(1768354800.974:910): pid=5874 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:03.769541 kubelet[2924]: E0114 01:40:03.769465 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:40:04.769154 kubelet[2924]: E0114 01:40:04.768962 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:40:06.081942 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:40:06.082047 kernel: audit: type=1130 audit(1768354806.078:912): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.22.183:22-68.220.241.50:35442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:06.078000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.22.183:22-68.220.241.50:35442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:06.080536 systemd[1]: Started sshd@28-10.0.22.183:22-68.220.241.50:35442.service - OpenSSH per-connection server daemon (68.220.241.50:35442). Jan 14 01:40:06.614000 audit[5890]: USER_ACCT pid=5890 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.617220 sshd[5890]: Accepted publickey for core from 68.220.241.50 port 35442 ssh2: RSA SHA256:sbFoGtRFAtOHBsi8EGjpgLe6A0GZZFX7Mtyn33eMzVs Jan 14 01:40:06.619882 sshd-session[5890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:06.624156 kernel: audit: type=1101 audit(1768354806.614:913): pid=5890 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.624281 kernel: audit: type=1103 audit(1768354806.616:914): pid=5890 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.616000 audit[5890]: CRED_ACQ pid=5890 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 
01:40:06.632579 kernel: audit: type=1006 audit(1768354806.616:915): pid=5890 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=30 res=1 Jan 14 01:40:06.616000 audit[5890]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe33b6f5c0 a2=3 a3=0 items=0 ppid=1 pid=5890 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:06.640683 kernel: audit: type=1300 audit(1768354806.616:915): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe33b6f5c0 a2=3 a3=0 items=0 ppid=1 pid=5890 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:06.637661 systemd-logind[1652]: New session 30 of user core. Jan 14 01:40:06.616000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:40:06.642499 kernel: audit: type=1327 audit(1768354806.616:915): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:40:06.645064 systemd[1]: Started session-30.scope - Session 30 of User core. 
Jan 14 01:40:06.648000 audit[5890]: USER_START pid=5890 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.651000 audit[5894]: CRED_ACQ pid=5894 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.658472 kernel: audit: type=1105 audit(1768354806.648:916): pid=5890 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.658550 kernel: audit: type=1103 audit(1768354806.651:917): pid=5894 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.785506 kubelet[2924]: E0114 01:40:06.785424 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:40:06.787857 kubelet[2924]: E0114 01:40:06.786988 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:40:06.976508 sshd[5894]: Connection closed by 68.220.241.50 port 35442 Jan 14 01:40:06.978739 sshd-session[5890]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:06.978000 audit[5890]: USER_END pid=5890 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.983823 systemd[1]: sshd@28-10.0.22.183:22-68.220.241.50:35442.service: Deactivated successfully. 
Jan 14 01:40:06.985497 systemd[1]: session-30.scope: Deactivated successfully. Jan 14 01:40:06.986232 kernel: audit: type=1106 audit(1768354806.978:918): pid=5890 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.978000 audit[5890]: CRED_DISP pid=5890 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.988548 systemd-logind[1652]: Session 30 logged out. Waiting for processes to exit. Jan 14 01:40:06.991894 kernel: audit: type=1104 audit(1768354806.978:919): pid=5890 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:40:06.982000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.22.183:22-68.220.241.50:35442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:06.992282 systemd-logind[1652]: Removed session 30. 
Jan 14 01:40:10.770953 kubelet[2924]: E0114 01:40:10.770885 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:40:12.772029 kubelet[2924]: E0114 01:40:12.771979 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:40:15.770116 kubelet[2924]: E0114 01:40:15.769971 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:40:17.771180 kubelet[2924]: E0114 01:40:17.770995 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:40:18.770213 kubelet[2924]: E0114 01:40:18.770060 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:40:21.770165 kubelet[2924]: E0114 01:40:21.770109 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:40:23.770107 kubelet[2924]: E0114 01:40:23.769534 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:40:26.772038 kubelet[2924]: E0114 01:40:26.771587 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:40:27.769719 kubelet[2924]: E0114 01:40:27.769626 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" 
podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:40:30.771408 kubelet[2924]: E0114 01:40:30.771363 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:40:32.771717 kubelet[2924]: E0114 01:40:32.770884 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:40:34.870741 systemd[1]: cri-containerd-958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893.scope: Deactivated successfully. Jan 14 01:40:34.871537 systemd[1]: cri-containerd-958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893.scope: Consumed 5.213s CPU time, 58M memory peak, 344K read from disk. 
Jan 14 01:40:34.871000 audit: BPF prog-id=256 op=LOAD Jan 14 01:40:34.873179 containerd[1684]: time="2026-01-14T01:40:34.872989687Z" level=info msg="received container exit event container_id:\"958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893\" id:\"958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893\" pid:2782 exit_status:1 exited_at:{seconds:1768354834 nanos:871832780}" Jan 14 01:40:34.874453 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:40:34.874512 kernel: audit: type=1334 audit(1768354834.871:921): prog-id=256 op=LOAD Jan 14 01:40:34.871000 audit: BPF prog-id=93 op=UNLOAD Jan 14 01:40:34.877000 audit: BPF prog-id=108 op=UNLOAD Jan 14 01:40:34.879070 kernel: audit: type=1334 audit(1768354834.871:922): prog-id=93 op=UNLOAD Jan 14 01:40:34.879109 kernel: audit: type=1334 audit(1768354834.877:923): prog-id=108 op=UNLOAD Jan 14 01:40:34.877000 audit: BPF prog-id=112 op=UNLOAD Jan 14 01:40:34.880589 kernel: audit: type=1334 audit(1768354834.877:924): prog-id=112 op=UNLOAD Jan 14 01:40:34.900597 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893-rootfs.mount: Deactivated successfully. 
Jan 14 01:40:35.037536 kubelet[2924]: I0114 01:40:35.037504 2924 scope.go:117] "RemoveContainer" containerID="958152e2a9b58113f9267aa26199d633f3011875c049b42bfa1ccea1bb39c893" Jan 14 01:40:35.062879 containerd[1684]: time="2026-01-14T01:40:35.062762150Z" level=info msg="CreateContainer within sandbox \"f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 14 01:40:35.079204 containerd[1684]: time="2026-01-14T01:40:35.078393418Z" level=info msg="Container 86dd35ef087152b995619ac33785094422c7577f78a8c087d90a30f8915d63f4: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:40:35.090315 containerd[1684]: time="2026-01-14T01:40:35.090278668Z" level=info msg="CreateContainer within sandbox \"f5006eae0a1916432390383d5fa63d8a2cbee1e3dc9dae2a84a3dd1d6b8cad3a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"86dd35ef087152b995619ac33785094422c7577f78a8c087d90a30f8915d63f4\"" Jan 14 01:40:35.091155 containerd[1684]: time="2026-01-14T01:40:35.090983539Z" level=info msg="StartContainer for \"86dd35ef087152b995619ac33785094422c7577f78a8c087d90a30f8915d63f4\"" Jan 14 01:40:35.092241 containerd[1684]: time="2026-01-14T01:40:35.092207292Z" level=info msg="connecting to shim 86dd35ef087152b995619ac33785094422c7577f78a8c087d90a30f8915d63f4" address="unix:///run/containerd/s/60b3286c09c1c34d2c4a71b1f2a8dc3d092805b02167a274ad4ba3c8e99b8fc2" protocol=ttrpc version=3 Jan 14 01:40:35.118094 systemd[1]: Started cri-containerd-86dd35ef087152b995619ac33785094422c7577f78a8c087d90a30f8915d63f4.scope - libcontainer container 86dd35ef087152b995619ac33785094422c7577f78a8c087d90a30f8915d63f4. 
Jan 14 01:40:35.130000 audit: BPF prog-id=257 op=LOAD Jan 14 01:40:35.132915 kernel: audit: type=1334 audit(1768354835.130:925): prog-id=257 op=LOAD Jan 14 01:40:35.133007 kernel: audit: type=1334 audit(1768354835.132:926): prog-id=258 op=LOAD Jan 14 01:40:35.132000 audit: BPF prog-id=258 op=LOAD Jan 14 01:40:35.132000 audit[5962]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.132000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836646433356566303837313532623939353631396163333337383530 Jan 14 01:40:35.141778 kernel: audit: type=1300 audit(1768354835.132:926): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.141868 kernel: audit: type=1327 audit(1768354835.132:926): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836646433356566303837313532623939353631396163333337383530 Jan 14 01:40:35.133000 audit: BPF prog-id=258 op=UNLOAD Jan 14 01:40:35.145105 kernel: audit: type=1334 audit(1768354835.133:927): prog-id=258 op=UNLOAD Jan 14 01:40:35.133000 audit[5962]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836646433356566303837313532623939353631396163333337383530 Jan 14 01:40:35.133000 audit: BPF prog-id=259 op=LOAD Jan 14 01:40:35.133000 audit[5962]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836646433356566303837313532623939353631396163333337383530 Jan 14 01:40:35.133000 audit: BPF prog-id=260 op=LOAD Jan 14 01:40:35.133000 audit[5962]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836646433356566303837313532623939353631396163333337383530 Jan 14 01:40:35.133000 audit: BPF prog-id=260 op=UNLOAD Jan 14 01:40:35.133000 audit[5962]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.150923 kernel: audit: type=1300 audit(1768354835.133:927): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836646433356566303837313532623939353631396163333337383530 Jan 14 01:40:35.133000 audit: BPF prog-id=259 op=UNLOAD Jan 14 01:40:35.133000 audit[5962]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836646433356566303837313532623939353631396163333337383530 Jan 14 01:40:35.133000 audit: BPF prog-id=261 op=LOAD Jan 14 01:40:35.133000 audit[5962]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2624 pid=5962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:35.133000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836646433356566303837313532623939353631396163333337383530 Jan 14 01:40:35.191940 containerd[1684]: time="2026-01-14T01:40:35.191903319Z" level=info msg="StartContainer for \"86dd35ef087152b995619ac33785094422c7577f78a8c087d90a30f8915d63f4\" returns successfully" Jan 14 01:40:35.310697 kubelet[2924]: E0114 01:40:35.310661 2924 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.22.183:33988->10.0.22.196:2379: read: connection timed out" Jan 14 01:40:35.573002 systemd[1]: cri-containerd-3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a.scope: Deactivated successfully. Jan 14 01:40:35.573273 systemd[1]: cri-containerd-3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a.scope: Consumed 1min 9.094s CPU time, 118.1M memory peak. Jan 14 01:40:35.575000 audit: BPF prog-id=146 op=UNLOAD Jan 14 01:40:35.575000 audit: BPF prog-id=150 op=UNLOAD Jan 14 01:40:35.577469 containerd[1684]: time="2026-01-14T01:40:35.577427849Z" level=info msg="received container exit event container_id:\"3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a\" id:\"3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a\" pid:3246 exit_status:1 exited_at:{seconds:1768354835 nanos:576518576}" Jan 14 01:40:35.604442 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a-rootfs.mount: Deactivated successfully. 
Jan 14 01:40:35.772971 kubelet[2924]: E0114 01:40:35.772926 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:40:36.051838 kubelet[2924]: I0114 01:40:36.051812 2924 scope.go:117] "RemoveContainer" containerID="3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a" Jan 14 01:40:36.075218 containerd[1684]: time="2026-01-14T01:40:36.075172466Z" level=info msg="CreateContainer within sandbox \"2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 14 01:40:36.091363 containerd[1684]: time="2026-01-14T01:40:36.091307315Z" level=info msg="Container 48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:40:36.102422 containerd[1684]: time="2026-01-14T01:40:36.102367212Z" level=info msg="CreateContainer within sandbox \"2aabc73ce5ac253664affec3bd413335d4c368cc331f327529143dcd8fe9fbe9\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad\"" Jan 14 01:40:36.102965 containerd[1684]: 
time="2026-01-14T01:40:36.102946958Z" level=info msg="StartContainer for \"48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad\"" Jan 14 01:40:36.104909 containerd[1684]: time="2026-01-14T01:40:36.104883395Z" level=info msg="connecting to shim 48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad" address="unix:///run/containerd/s/18a1b34cd3f078a2bdee90eb3eb3d846643daeecc4b11bff748e6d0e041b0916" protocol=ttrpc version=3 Jan 14 01:40:36.127009 systemd[1]: Started cri-containerd-48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad.scope - libcontainer container 48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad. Jan 14 01:40:36.140000 audit: BPF prog-id=262 op=LOAD Jan 14 01:40:36.140000 audit: BPF prog-id=263 op=LOAD Jan 14 01:40:36.140000 audit[6004]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=3084 pid=6004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:36.140000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438633937343334613730333537393363373230636464373632313764 Jan 14 01:40:36.140000 audit: BPF prog-id=263 op=UNLOAD Jan 14 01:40:36.140000 audit[6004]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=6004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:36.140000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438633937343334613730333537393363373230636464373632313764 Jan 14 01:40:36.140000 audit: BPF prog-id=264 op=LOAD Jan 14 01:40:36.140000 audit[6004]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3084 pid=6004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:36.140000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438633937343334613730333537393363373230636464373632313764 Jan 14 01:40:36.140000 audit: BPF prog-id=265 op=LOAD Jan 14 01:40:36.140000 audit[6004]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3084 pid=6004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:36.140000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438633937343334613730333537393363373230636464373632313764 Jan 14 01:40:36.140000 audit: BPF prog-id=265 op=UNLOAD Jan 14 01:40:36.140000 audit[6004]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=6004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:40:36.140000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438633937343334613730333537393363373230636464373632313764 Jan 14 01:40:36.140000 audit: BPF prog-id=264 op=UNLOAD Jan 14 01:40:36.140000 audit[6004]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3084 pid=6004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:36.140000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438633937343334613730333537393363373230636464373632313764 Jan 14 01:40:36.140000 audit: BPF prog-id=266 op=LOAD Jan 14 01:40:36.140000 audit[6004]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3084 pid=6004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:36.140000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438633937343334613730333537393363373230636464373632313764 Jan 14 01:40:36.162559 containerd[1684]: time="2026-01-14T01:40:36.162522183Z" level=info msg="StartContainer for \"48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad\" returns successfully" Jan 14 01:40:37.769072 kubelet[2924]: E0114 01:40:37.769032 2924 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2" Jan 14 01:40:38.770404 kubelet[2924]: E0114 01:40:38.770231 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:40:38.771996 kubelet[2924]: E0114 01:40:38.771937 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:40:39.718968 kubelet[2924]: E0114 01:40:39.717933 2924 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.22.183:33792->10.0.22.196:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4578-0-0-p-557efd55ff.188a75591a499385 
kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4578-0-0-p-557efd55ff,UID:27e73373a1da7666e92e1356b1b10768,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4578-0-0-p-557efd55ff,},FirstTimestamp:2026-01-14 01:40:29.249909637 +0000 UTC m=+522.582138983,LastTimestamp:2026-01-14 01:40:29.249909637 +0000 UTC m=+522.582138983,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4578-0-0-p-557efd55ff,}" Jan 14 01:40:40.038520 kubelet[2924]: I0114 01:40:40.038442 2924 status_manager.go:890] "Failed to get status for pod" podUID="141622e70e2aacc762987119dd29e1d8" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-557efd55ff" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.22.183:33890->10.0.22.196:2379: read: connection timed out" Jan 14 01:40:40.763881 systemd[1]: cri-containerd-b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956.scope: Deactivated successfully. Jan 14 01:40:40.766340 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 14 01:40:40.766488 kernel: audit: type=1334 audit(1768354840.764:943): prog-id=267 op=LOAD Jan 14 01:40:40.764000 audit: BPF prog-id=267 op=LOAD Jan 14 01:40:40.764454 systemd[1]: cri-containerd-b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956.scope: Consumed 3.708s CPU time, 24.7M memory peak, 256K read from disk. 
Jan 14 01:40:40.767720 containerd[1684]: time="2026-01-14T01:40:40.767629812Z" level=info msg="received container exit event container_id:\"b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956\" id:\"b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956\" pid:2762 exit_status:1 exited_at:{seconds:1768354840 nanos:767321368}" Jan 14 01:40:40.764000 audit: BPF prog-id=88 op=UNLOAD Jan 14 01:40:40.769340 kernel: audit: type=1334 audit(1768354840.764:944): prog-id=88 op=UNLOAD Jan 14 01:40:40.767000 audit: BPF prog-id=103 op=UNLOAD Jan 14 01:40:40.770920 kernel: audit: type=1334 audit(1768354840.767:945): prog-id=103 op=UNLOAD Jan 14 01:40:40.767000 audit: BPF prog-id=107 op=UNLOAD Jan 14 01:40:40.773860 kernel: audit: type=1334 audit(1768354840.767:946): prog-id=107 op=UNLOAD Jan 14 01:40:40.794578 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956-rootfs.mount: Deactivated successfully. Jan 14 01:40:41.072126 kubelet[2924]: I0114 01:40:41.071948 2924 scope.go:117] "RemoveContainer" containerID="b41062327d0f4eac335e2505dccc3170193ba8efe0145922c43ed640c2af6956" Jan 14 01:40:41.076541 containerd[1684]: time="2026-01-14T01:40:41.076467447Z" level=info msg="CreateContainer within sandbox \"0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 14 01:40:41.094869 containerd[1684]: time="2026-01-14T01:40:41.091772346Z" level=info msg="Container 6a2a7a4f7107f03a4a101d6ecc66d42eb75341511a0a389c67c8fe697d428bf9: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:40:41.098538 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2627680074.mount: Deactivated successfully. 
Jan 14 01:40:41.106952 containerd[1684]: time="2026-01-14T01:40:41.106913044Z" level=info msg="CreateContainer within sandbox \"0434e14efaa8c4e670c5ddc44ad3ee32972bc016d7d0915a7a5150e5417c6db9\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"6a2a7a4f7107f03a4a101d6ecc66d42eb75341511a0a389c67c8fe697d428bf9\"" Jan 14 01:40:41.107641 containerd[1684]: time="2026-01-14T01:40:41.107615803Z" level=info msg="StartContainer for \"6a2a7a4f7107f03a4a101d6ecc66d42eb75341511a0a389c67c8fe697d428bf9\"" Jan 14 01:40:41.109406 containerd[1684]: time="2026-01-14T01:40:41.109375655Z" level=info msg="connecting to shim 6a2a7a4f7107f03a4a101d6ecc66d42eb75341511a0a389c67c8fe697d428bf9" address="unix:///run/containerd/s/60f022b80d018d7878e94938b05bdb3616a524ee98254817c9adf6503f7bc1d8" protocol=ttrpc version=3 Jan 14 01:40:41.136038 systemd[1]: Started cri-containerd-6a2a7a4f7107f03a4a101d6ecc66d42eb75341511a0a389c67c8fe697d428bf9.scope - libcontainer container 6a2a7a4f7107f03a4a101d6ecc66d42eb75341511a0a389c67c8fe697d428bf9. 
Jan 14 01:40:41.148000 audit: BPF prog-id=268 op=LOAD Jan 14 01:40:41.149000 audit: BPF prog-id=269 op=LOAD Jan 14 01:40:41.151277 kernel: audit: type=1334 audit(1768354841.148:947): prog-id=268 op=LOAD Jan 14 01:40:41.151342 kernel: audit: type=1334 audit(1768354841.149:948): prog-id=269 op=LOAD Jan 14 01:40:41.149000 audit[6055]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.154078 kernel: audit: type=1300 audit(1768354841.149:948): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.149000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661326137613466373130376630336134613130316436656363363664 Jan 14 01:40:41.158640 kernel: audit: type=1327 audit(1768354841.149:948): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661326137613466373130376630336134613130316436656363363664 Jan 14 01:40:41.149000 audit: BPF prog-id=269 op=UNLOAD Jan 14 01:40:41.162023 kernel: audit: type=1334 audit(1768354841.149:949): prog-id=269 op=UNLOAD Jan 14 01:40:41.149000 audit[6055]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.164642 kernel: audit: type=1300 audit(1768354841.149:949): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.149000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661326137613466373130376630336134613130316436656363363664 Jan 14 01:40:41.151000 audit: BPF prog-id=270 op=LOAD Jan 14 01:40:41.151000 audit[6055]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661326137613466373130376630336134613130316436656363363664 Jan 14 01:40:41.151000 audit: BPF prog-id=271 op=LOAD Jan 14 01:40:41.151000 audit[6055]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.151000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661326137613466373130376630336134613130316436656363363664 Jan 14 01:40:41.151000 audit: BPF prog-id=271 op=UNLOAD Jan 14 01:40:41.151000 audit[6055]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661326137613466373130376630336134613130316436656363363664 Jan 14 01:40:41.151000 audit: BPF prog-id=270 op=UNLOAD Jan 14 01:40:41.151000 audit[6055]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661326137613466373130376630336134613130316436656363363664 Jan 14 01:40:41.151000 audit: BPF prog-id=272 op=LOAD Jan 14 01:40:41.151000 audit[6055]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=2640 pid=6055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:40:41.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661326137613466373130376630336134613130316436656363363664 Jan 14 01:40:41.200826 containerd[1684]: time="2026-01-14T01:40:41.200737465Z" level=info msg="StartContainer for \"6a2a7a4f7107f03a4a101d6ecc66d42eb75341511a0a389c67c8fe697d428bf9\" returns successfully" Jan 14 01:40:42.769694 kubelet[2924]: E0114 01:40:42.769632 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bfb5m" podUID="e9f3a916-9cf2-4ab2-9a2c-762b6410a5d2" Jan 14 01:40:45.311823 kubelet[2924]: E0114 01:40:45.311270 2924 controller.go:195] "Failed to update lease" err="Put \"https://10.0.22.183:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-557efd55ff?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 14 01:40:46.778816 kubelet[2924]: E0114 01:40:46.778721 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-qjrqf" podUID="a4130f46-1c6e-474c-9a1c-5fd1820934c0" Jan 14 01:40:47.336821 systemd[1]: cri-containerd-48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad.scope: Deactivated successfully. Jan 14 01:40:47.338735 containerd[1684]: time="2026-01-14T01:40:47.338693208Z" level=info msg="received container exit event container_id:\"48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad\" id:\"48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad\" pid:6017 exit_status:1 exited_at:{seconds:1768354847 nanos:338296025}" Jan 14 01:40:47.339000 audit: BPF prog-id=262 op=UNLOAD Jan 14 01:40:47.341357 kernel: kauditd_printk_skb: 16 callbacks suppressed Jan 14 01:40:47.341430 kernel: audit: type=1334 audit(1768354847.339:955): prog-id=262 op=UNLOAD Jan 14 01:40:47.339000 audit: BPF prog-id=266 op=UNLOAD Jan 14 01:40:47.343208 kernel: audit: type=1334 audit(1768354847.339:956): prog-id=266 op=UNLOAD Jan 14 01:40:47.368810 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad-rootfs.mount: Deactivated successfully. 
Jan 14 01:40:48.095794 kubelet[2924]: I0114 01:40:48.095111 2924 scope.go:117] "RemoveContainer" containerID="3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a" Jan 14 01:40:48.095794 kubelet[2924]: I0114 01:40:48.095344 2924 scope.go:117] "RemoveContainer" containerID="48c97434a7035793c720cdd76217de8a2a12a07b01bbaca8c8b532cc702929ad" Jan 14 01:40:48.095794 kubelet[2924]: E0114 01:40:48.095465 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-gjw9x_tigera-operator(d63518e5-af84-444e-9f27-a47ff2026ae3)\"" pod="tigera-operator/tigera-operator-7dcd859c48-gjw9x" podUID="d63518e5-af84-444e-9f27-a47ff2026ae3" Jan 14 01:40:48.097476 containerd[1684]: time="2026-01-14T01:40:48.097448534Z" level=info msg="RemoveContainer for \"3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a\"" Jan 14 01:40:48.104008 containerd[1684]: time="2026-01-14T01:40:48.103962814Z" level=info msg="RemoveContainer for \"3649c6091c17dc95dac3d5b904856065a3e0232583e07c3da1815ca8e7bf266a\" returns successfully" Jan 14 01:40:50.772915 kubelet[2924]: E0114 01:40:50.772864 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-84b7d46b9c-rw586" podUID="7534bfa4-9af8-4640-8306-673448a61bb0" Jan 14 01:40:50.774499 kubelet[2924]: E0114 01:40:50.774216 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fcdf56664-kjljc" podUID="5694b728-96e4-405e-ad55-bbb10255a07e" Jan 14 01:40:50.774499 kubelet[2924]: E0114 01:40:50.774352 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5f6c79b459-7rtbs" podUID="327b5d4f-3ff4-43f6-98c0-8969f3f3f2d2" Jan 14 01:40:50.775110 kubelet[2924]: E0114 01:40:50.774701 2924 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-gknfk" podUID="9e0350f1-074c-4801-8433-6b63afe081c2"