Jan 16 21:15:57.035489 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 18:44:02 -00 2026
Jan 16 21:15:57.035519 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=e880b5400e832e1de59b993d9ba6b86a9089175f10b4985da8b7b47cc8c74099
Jan 16 21:15:57.035529 kernel: BIOS-provided physical RAM map:
Jan 16 21:15:57.035536 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 16 21:15:57.035542 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Jan 16 21:15:57.035548 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jan 16 21:15:57.035557 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Jan 16 21:15:57.035564 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jan 16 21:15:57.035570 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Jan 16 21:15:57.035577 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jan 16 21:15:57.035583 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable
Jan 16 21:15:57.035589 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Jan 16 21:15:57.035605 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Jan 16 21:15:57.035612 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Jan 16 21:15:57.035622 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Jan 16 21:15:57.035628 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 16 21:15:57.035635 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 16 21:15:57.035642 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 16 21:15:57.035648 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable
Jan 16 21:15:57.035655 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Jan 16 21:15:57.035663 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Jan 16 21:15:57.035670 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable
Jan 16 21:15:57.035677 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Jan 16 21:15:57.035683 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Jan 16 21:15:57.035690 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 16 21:15:57.035696 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 16 21:15:57.035703 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 16 21:15:57.035709 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable
Jan 16 21:15:57.035716 kernel: NX (Execute Disable) protection: active
Jan 16 21:15:57.035722 kernel: APIC: Static calls initialized
Jan 16 21:15:57.035729 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable
Jan 16 21:15:57.035738 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable
Jan 16 21:15:57.035744 kernel: extended physical RAM map:
Jan 16 21:15:57.035751 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 16 21:15:57.035758 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Jan 16 21:15:57.035764 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jan 16 21:15:57.035771 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Jan 16 21:15:57.035778 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jan 16 21:15:57.035784 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Jan 16 21:15:57.035791 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jan 16 21:15:57.035803 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable
Jan 16 21:15:57.035810 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable
Jan 16 21:15:57.035817 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable
Jan 16 21:15:57.035824 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable
Jan 16 21:15:57.035833 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable
Jan 16 21:15:57.035840 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Jan 16 21:15:57.035847 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Jan 16 21:15:57.035854 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Jan 16 21:15:57.035861 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Jan 16 21:15:57.035868 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 16 21:15:57.035875 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 16 21:15:57.035882 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 16 21:15:57.035889 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable
Jan 16 21:15:57.035896 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Jan 16 21:15:57.035903 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Jan 16 21:15:57.035912 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable
Jan 16 21:15:57.035919 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Jan 16 21:15:57.035926 kernel: reserve setup_data: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Jan 16 21:15:57.035933 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 16 21:15:57.035940 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 16 21:15:57.035947 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 16 21:15:57.035954 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable
Jan 16 21:15:57.035961 kernel: efi: EFI v2.7 by EDK II
Jan 16 21:15:57.035968 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018
Jan 16 21:15:57.035975 kernel: random: crng init done
Jan 16 21:15:57.035982 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Jan 16 21:15:57.035991 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Jan 16 21:15:57.035998 kernel: secureboot: Secure boot disabled
Jan 16 21:15:57.036005 kernel: SMBIOS 2.8 present.
Jan 16 21:15:57.036012 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Jan 16 21:15:57.036019 kernel: DMI: Memory slots populated: 1/1
Jan 16 21:15:57.036026 kernel: Hypervisor detected: KVM
Jan 16 21:15:57.036033 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Jan 16 21:15:57.036040 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 16 21:15:57.036048 kernel: kvm-clock: using sched offset of 5591379279 cycles
Jan 16 21:15:57.036055 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 16 21:15:57.036065 kernel: tsc: Detected 2294.608 MHz processor
Jan 16 21:15:57.036073 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 16 21:15:57.036080 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 16 21:15:57.036088 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000
Jan 16 21:15:57.036095 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Jan 16 21:15:57.036103 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 16 21:15:57.036111 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Jan 16 21:15:57.036118 kernel: Using GB pages for direct mapping
Jan 16 21:15:57.036127 kernel: ACPI: Early table checksum verification disabled
Jan 16 21:15:57.036135 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Jan 16 21:15:57.036142 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013)
Jan 16 21:15:57.036150 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 21:15:57.036157 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 21:15:57.036164 kernel: ACPI: FACS 0x000000007FBDD000 000040
Jan 16 21:15:57.036172 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 21:15:57.036181 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 21:15:57.036188 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 21:15:57.036196 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 16 21:15:57.036203 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3]
Jan 16 21:15:57.036211 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b]
Jan 16 21:15:57.036218 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Jan 16 21:15:57.036226 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f]
Jan 16 21:15:57.036235 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b]
Jan 16 21:15:57.036242 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027]
Jan 16 21:15:57.036250 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037]
Jan 16 21:15:57.036257 kernel: No NUMA configuration found
Jan 16 21:15:57.036265 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff]
Jan 16 21:15:57.036272 kernel: NODE_DATA(0) allocated [mem 0x17fff8dc0-0x17fffffff]
Jan 16 21:15:57.036280 kernel: Zone ranges:
Jan 16 21:15:57.036288 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 16 21:15:57.036297 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Jan 16 21:15:57.036304 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff]
Jan 16 21:15:57.036312 kernel: Device empty
Jan 16 21:15:57.036319 kernel: Movable zone start for each node
Jan 16 21:15:57.036327 kernel: Early memory node ranges
Jan 16 21:15:57.036334 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Jan 16 21:15:57.036341 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Jan 16 21:15:57.036350 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Jan 16 21:15:57.036358 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Jan 16 21:15:57.036365 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff]
Jan 16 21:15:57.036373 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff]
Jan 16 21:15:57.036382 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff]
Jan 16 21:15:57.036393 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff]
Jan 16 21:15:57.036403 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff]
Jan 16 21:15:57.036411 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff]
Jan 16 21:15:57.036419 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff]
Jan 16 21:15:57.036427 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 16 21:15:57.036437 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Jan 16 21:15:57.036445 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Jan 16 21:15:57.036453 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 16 21:15:57.036460 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Jan 16 21:15:57.036470 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Jan 16 21:15:57.036478 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges
Jan 16 21:15:57.036486 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Jan 16 21:15:57.036494 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Jan 16 21:15:57.036502 kernel: On node 0, zone Normal: 276 pages in unavailable ranges
Jan 16 21:15:57.036510 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 16 21:15:57.036518 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 16 21:15:57.036526 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 16 21:15:57.038749 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 16 21:15:57.038767 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 16 21:15:57.038776 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 16 21:15:57.038785 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 16 21:15:57.038793 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 16 21:15:57.038802 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 16 21:15:57.038810 kernel: TSC deadline timer available
Jan 16 21:15:57.038821 kernel: CPU topo: Max. logical packages: 2
Jan 16 21:15:57.038830 kernel: CPU topo: Max. logical dies: 2
Jan 16 21:15:57.038838 kernel: CPU topo: Max. dies per package: 1
Jan 16 21:15:57.038846 kernel: CPU topo: Max. threads per core: 1
Jan 16 21:15:57.038854 kernel: CPU topo: Num. cores per package: 1
Jan 16 21:15:57.038862 kernel: CPU topo: Num. threads per package: 1
Jan 16 21:15:57.038870 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jan 16 21:15:57.038880 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 16 21:15:57.038888 kernel: kvm-guest: KVM setup pv remote TLB flush
Jan 16 21:15:57.038896 kernel: kvm-guest: setup PV sched yield
Jan 16 21:15:57.038904 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Jan 16 21:15:57.038913 kernel: Booting paravirtualized kernel on KVM
Jan 16 21:15:57.038921 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 16 21:15:57.038929 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jan 16 21:15:57.038938 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jan 16 21:15:57.038948 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jan 16 21:15:57.038956 kernel: pcpu-alloc: [0] 0 1
Jan 16 21:15:57.038964 kernel: kvm-guest: PV spinlocks enabled
Jan 16 21:15:57.038972 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 16 21:15:57.038982 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=e880b5400e832e1de59b993d9ba6b86a9089175f10b4985da8b7b47cc8c74099
Jan 16 21:15:57.038991 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 16 21:15:57.039001 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 16 21:15:57.039009 kernel: Fallback order for Node 0: 0
Jan 16 21:15:57.039017 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1046694
Jan 16 21:15:57.039025 kernel: Policy zone: Normal
Jan 16 21:15:57.039033 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 16 21:15:57.039041 kernel: software IO TLB: area num 2.
Jan 16 21:15:57.039049 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 16 21:15:57.039059 kernel: ftrace: allocating 40128 entries in 157 pages
Jan 16 21:15:57.039067 kernel: ftrace: allocated 157 pages with 5 groups
Jan 16 21:15:57.039075 kernel: Dynamic Preempt: voluntary
Jan 16 21:15:57.039083 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 16 21:15:57.039093 kernel: rcu: RCU event tracing is enabled.
Jan 16 21:15:57.039101 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 16 21:15:57.039109 kernel: Trampoline variant of Tasks RCU enabled.
Jan 16 21:15:57.039118 kernel: Rude variant of Tasks RCU enabled.
Jan 16 21:15:57.039127 kernel: Tracing variant of Tasks RCU enabled.
Jan 16 21:15:57.039136 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 16 21:15:57.039144 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 16 21:15:57.039152 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 16 21:15:57.039160 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 16 21:15:57.039169 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 16 21:15:57.039177 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jan 16 21:15:57.039187 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 16 21:15:57.039195 kernel: Console: colour dummy device 80x25
Jan 16 21:15:57.039203 kernel: printk: legacy console [tty0] enabled
Jan 16 21:15:57.039211 kernel: printk: legacy console [ttyS0] enabled
Jan 16 21:15:57.039220 kernel: ACPI: Core revision 20240827
Jan 16 21:15:57.039228 kernel: APIC: Switch to symmetric I/O mode setup
Jan 16 21:15:57.039236 kernel: x2apic enabled
Jan 16 21:15:57.039244 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 16 21:15:57.039254 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Jan 16 21:15:57.039263 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Jan 16 21:15:57.039271 kernel: kvm-guest: setup PV IPIs
Jan 16 21:15:57.039279 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns
Jan 16 21:15:57.039287 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608)
Jan 16 21:15:57.039295 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 16 21:15:57.039305 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 16 21:15:57.039313 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 16 21:15:57.039321 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 16 21:15:57.039328 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Jan 16 21:15:57.039335 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 16 21:15:57.039343 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 16 21:15:57.039351 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 16 21:15:57.039359 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 16 21:15:57.039367 kernel: TAA: Mitigation: Clear CPU buffers
Jan 16 21:15:57.039374 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Jan 16 21:15:57.039382 kernel: active return thunk: its_return_thunk
Jan 16 21:15:57.039391 kernel: ITS: Mitigation: Aligned branch/return thunks
Jan 16 21:15:57.039399 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 16 21:15:57.039407 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 16 21:15:57.039414 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 16 21:15:57.039422 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Jan 16 21:15:57.039429 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Jan 16 21:15:57.039437 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Jan 16 21:15:57.039445 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Jan 16 21:15:57.039452 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 16 21:15:57.039462 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Jan 16 21:15:57.039469 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Jan 16 21:15:57.039477 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Jan 16 21:15:57.039484 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Jan 16 21:15:57.039492 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Jan 16 21:15:57.039500 kernel: Freeing SMP alternatives memory: 32K
Jan 16 21:15:57.039507 kernel: pid_max: default: 32768 minimum: 301
Jan 16 21:15:57.039515 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 16 21:15:57.039522 kernel: landlock: Up and running.
Jan 16 21:15:57.039530 kernel: SELinux: Initializing.
Jan 16 21:15:57.039537 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 16 21:15:57.039545 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 16 21:15:57.039555 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6)
Jan 16 21:15:57.039563 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver.
Jan 16 21:15:57.039571 kernel: ... version:                2
Jan 16 21:15:57.039579 kernel: ... bit width:              48
Jan 16 21:15:57.039587 kernel: ... generic registers:      8
Jan 16 21:15:57.039991 kernel: ... value mask:             0000ffffffffffff
Jan 16 21:15:57.040206 kernel: ... max period:             00007fffffffffff
Jan 16 21:15:57.040218 kernel: ... fixed-purpose events:   3
Jan 16 21:15:57.040226 kernel: ... event mask:             00000007000000ff
Jan 16 21:15:57.040234 kernel: signal: max sigframe size: 3632
Jan 16 21:15:57.040243 kernel: rcu: Hierarchical SRCU implementation.
Jan 16 21:15:57.040251 kernel: rcu: Max phase no-delay instances is 400.
Jan 16 21:15:57.040260 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 16 21:15:57.040268 kernel: smp: Bringing up secondary CPUs ...
Jan 16 21:15:57.040276 kernel: smpboot: x86: Booting SMP configuration:
Jan 16 21:15:57.040286 kernel: .... node #0, CPUs: #1
Jan 16 21:15:57.040294 kernel: smp: Brought up 1 node, 2 CPUs
Jan 16 21:15:57.040302 kernel: smpboot: Total of 2 processors activated (9178.43 BogoMIPS)
Jan 16 21:15:57.040311 kernel: Memory: 3969768K/4186776K available (14336K kernel code, 2445K rwdata, 31644K rodata, 15536K init, 2500K bss, 212128K reserved, 0K cma-reserved)
Jan 16 21:15:57.040320 kernel: devtmpfs: initialized
Jan 16 21:15:57.040328 kernel: x86/mm: Memory block size: 128MB
Jan 16 21:15:57.040336 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Jan 16 21:15:57.040346 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Jan 16 21:15:57.040354 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Jan 16 21:15:57.040362 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Jan 16 21:15:57.040370 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes)
Jan 16 21:15:57.040378 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes)
Jan 16 21:15:57.040387 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 16 21:15:57.040395 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 16 21:15:57.040405 kernel: pinctrl core: initialized pinctrl subsystem
Jan 16 21:15:57.040414 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 16 21:15:57.040422 kernel: audit: initializing netlink subsys (disabled)
Jan 16 21:15:57.040431 kernel: audit: type=2000 audit(1768598153.683:1): state=initialized audit_enabled=0 res=1
Jan 16 21:15:57.040439 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 16 21:15:57.040447 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 16 21:15:57.040456 kernel: cpuidle: using governor menu
Jan 16 21:15:57.040466 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 16 21:15:57.040474 kernel: dca service started, version 1.12.1
Jan 16 21:15:57.040483 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Jan 16 21:15:57.040491 kernel: PCI: Using configuration type 1 for base access
Jan 16 21:15:57.040499 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 16 21:15:57.040508 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 16 21:15:57.040516 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 16 21:15:57.040526 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 16 21:15:57.040535 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 16 21:15:57.040543 kernel: ACPI: Added _OSI(Module Device)
Jan 16 21:15:57.040551 kernel: ACPI: Added _OSI(Processor Device)
Jan 16 21:15:57.040559 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 16 21:15:57.040567 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 16 21:15:57.040575 kernel: ACPI: Interpreter enabled
Jan 16 21:15:57.040585 kernel: ACPI: PM: (supports S0 S3 S5)
Jan 16 21:15:57.040593 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 16 21:15:57.040613 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 16 21:15:57.040622 kernel: PCI: Using E820 reservations for host bridge windows
Jan 16 21:15:57.040630 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 16 21:15:57.040638 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 16 21:15:57.040811 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 16 21:15:57.040918 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jan 16 21:15:57.041017 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jan 16 21:15:57.041028 kernel: PCI host bridge to bus 0000:00
Jan 16 21:15:57.041127 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 16 21:15:57.041227 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 16 21:15:57.041319 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 16 21:15:57.041407 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Jan 16 21:15:57.041494 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Jan 16 21:15:57.041581 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window]
Jan 16 21:15:57.041685 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 16 21:15:57.041800 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jan 16 21:15:57.041912 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 16 21:15:57.042011 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Jan 16 21:15:57.042112 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref]
Jan 16 21:15:57.042208 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff]
Jan 16 21:15:57.042304 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Jan 16 21:15:57.042400 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 16 21:15:57.042717 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.042816 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff]
Jan 16 21:15:57.043694 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 16 21:15:57.043807 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff]
Jan 16 21:15:57.043908 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff]
Jan 16 21:15:57.044034 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Jan 16 21:15:57.044681 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.044786 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff]
Jan 16 21:15:57.044883 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 16 21:15:57.044980 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff]
Jan 16 21:15:57.045078 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Jan 16 21:15:57.045185 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.045294 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff]
Jan 16 21:15:57.045390 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 16 21:15:57.045486 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff]
Jan 16 21:15:57.045589 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Jan 16 21:15:57.045707 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.045809 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff]
Jan 16 21:15:57.045907 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 16 21:15:57.046005 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff]
Jan 16 21:15:57.046102 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref]
Jan 16 21:15:57.046204 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.046305 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff]
Jan 16 21:15:57.046402 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 16 21:15:57.046499 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff]
Jan 16 21:15:57.046776 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref]
Jan 16 21:15:57.046897 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.046998 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff]
Jan 16 21:15:57.047099 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 16 21:15:57.047196 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff]
Jan 16 21:15:57.047292 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref]
Jan 16 21:15:57.047395 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.047494 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff]
Jan 16 21:15:57.047593 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 16 21:15:57.047702 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff]
Jan 16 21:15:57.047799 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref]
Jan 16 21:15:57.047904 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.048001 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff]
Jan 16 21:15:57.048097 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 16 21:15:57.048197 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff]
Jan 16 21:15:57.048295 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref]
Jan 16 21:15:57.048398 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.049430 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff]
Jan 16 21:15:57.049528 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 16 21:15:57.049646 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff]
Jan 16 21:15:57.049748 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref]
Jan 16 21:15:57.049854 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.049967 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff]
Jan 16 21:15:57.050065 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 16 21:15:57.050163 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff]
Jan 16 21:15:57.050260 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref]
Jan 16 21:15:57.050364 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.050462 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff]
Jan 16 21:15:57.050557 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 16 21:15:57.050670 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff]
Jan 16 21:15:57.050767 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref]
Jan 16 21:15:57.050873 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.051948 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff]
Jan 16 21:15:57.052064 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 16 21:15:57.052162 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff]
Jan 16 21:15:57.052258 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref]
Jan 16 21:15:57.052364 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.052462 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff]
Jan 16 21:15:57.052557 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 16 21:15:57.052674 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff]
Jan 16 21:15:57.052772 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref]
Jan 16 21:15:57.052874 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.052974 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff]
Jan 16 21:15:57.053071 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 16 21:15:57.053166 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff]
Jan 16 21:15:57.053275 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref]
Jan 16 21:15:57.053377 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.053478 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff]
Jan 16 21:15:57.053574 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 16 21:15:57.056747 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff]
Jan 16 21:15:57.056869 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref]
Jan 16 21:15:57.056978 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.057081 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff]
Jan 16 21:15:57.057186 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 16 21:15:57.057296 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff]
Jan 16 21:15:57.057396 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref]
Jan 16 21:15:57.057501 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.057612 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff]
Jan 16 21:15:57.057712 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 16 21:15:57.057813 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff]
Jan 16 21:15:57.057911 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref]
Jan 16 21:15:57.058014 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.058112 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff]
Jan 16 21:15:57.058208 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Jan 16 21:15:57.058304 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff]
Jan 16 21:15:57.058405 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref]
Jan 16 21:15:57.058506 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 21:15:57.059022 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff]
Jan 16 21:15:57.059338 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Jan 16 21:15:57.059440 kernel: pci 0000:00:04.2: bridge window [mem 
0x81c00000-0x81dfffff] Jan 16 21:15:57.059540 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 16 21:15:57.059666 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.059768 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Jan 16 21:15:57.059870 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 16 21:15:57.063695 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 16 21:15:57.063827 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 16 21:15:57.063938 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.064042 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Jan 16 21:15:57.064138 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 16 21:15:57.064234 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 16 21:15:57.064330 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 16 21:15:57.064434 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.064532 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Jan 16 21:15:57.066563 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 16 21:15:57.066703 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 16 21:15:57.066802 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 16 21:15:57.066906 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.067009 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Jan 16 21:15:57.067104 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 16 21:15:57.067200 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 16 21:15:57.067295 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 16 21:15:57.067396 kernel: pci 
0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.067495 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff] Jan 16 21:15:57.067591 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 16 21:15:57.067711 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 16 21:15:57.067807 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 16 21:15:57.067911 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.068007 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Jan 16 21:15:57.068106 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 16 21:15:57.068201 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 16 21:15:57.068297 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 16 21:15:57.068397 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.068492 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Jan 16 21:15:57.068587 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 16 21:15:57.070783 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 16 21:15:57.070889 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 16 21:15:57.070996 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.071093 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Jan 16 21:15:57.071191 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 16 21:15:57.071287 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 16 21:15:57.071388 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 16 21:15:57.071493 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.071591 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Jan 16 21:15:57.072772 kernel: 
pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 16 21:15:57.072906 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 16 21:15:57.073003 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 16 21:15:57.073114 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 21:15:57.073224 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Jan 16 21:15:57.073322 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 16 21:15:57.073417 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 16 21:15:57.073512 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 16 21:15:57.073651 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 16 21:15:57.073753 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 16 21:15:57.073855 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jan 16 21:15:57.073956 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Jan 16 21:15:57.074051 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Jan 16 21:15:57.074151 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 16 21:15:57.074249 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Jan 16 21:15:57.074357 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Jan 16 21:15:57.074456 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Jan 16 21:15:57.074554 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 16 21:15:57.079088 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 16 21:15:57.079215 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 16 21:15:57.079327 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 16 21:15:57.079429 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 16 
21:15:57.079539 kernel: pci_bus 0000:02: extended config space not accessible Jan 16 21:15:57.079552 kernel: acpiphp: Slot [1] registered Jan 16 21:15:57.079562 kernel: acpiphp: Slot [0] registered Jan 16 21:15:57.079570 kernel: acpiphp: Slot [2] registered Jan 16 21:15:57.079582 kernel: acpiphp: Slot [3] registered Jan 16 21:15:57.079591 kernel: acpiphp: Slot [4] registered Jan 16 21:15:57.079610 kernel: acpiphp: Slot [5] registered Jan 16 21:15:57.079619 kernel: acpiphp: Slot [6] registered Jan 16 21:15:57.079627 kernel: acpiphp: Slot [7] registered Jan 16 21:15:57.079635 kernel: acpiphp: Slot [8] registered Jan 16 21:15:57.079644 kernel: acpiphp: Slot [9] registered Jan 16 21:15:57.079653 kernel: acpiphp: Slot [10] registered Jan 16 21:15:57.079664 kernel: acpiphp: Slot [11] registered Jan 16 21:15:57.079674 kernel: acpiphp: Slot [12] registered Jan 16 21:15:57.079682 kernel: acpiphp: Slot [13] registered Jan 16 21:15:57.079690 kernel: acpiphp: Slot [14] registered Jan 16 21:15:57.079699 kernel: acpiphp: Slot [15] registered Jan 16 21:15:57.079708 kernel: acpiphp: Slot [16] registered Jan 16 21:15:57.079716 kernel: acpiphp: Slot [17] registered Jan 16 21:15:57.079727 kernel: acpiphp: Slot [18] registered Jan 16 21:15:57.079735 kernel: acpiphp: Slot [19] registered Jan 16 21:15:57.079744 kernel: acpiphp: Slot [20] registered Jan 16 21:15:57.079752 kernel: acpiphp: Slot [21] registered Jan 16 21:15:57.079761 kernel: acpiphp: Slot [22] registered Jan 16 21:15:57.079769 kernel: acpiphp: Slot [23] registered Jan 16 21:15:57.079778 kernel: acpiphp: Slot [24] registered Jan 16 21:15:57.079786 kernel: acpiphp: Slot [25] registered Jan 16 21:15:57.079797 kernel: acpiphp: Slot [26] registered Jan 16 21:15:57.079805 kernel: acpiphp: Slot [27] registered Jan 16 21:15:57.079814 kernel: acpiphp: Slot [28] registered Jan 16 21:15:57.079822 kernel: acpiphp: Slot [29] registered Jan 16 21:15:57.079830 kernel: acpiphp: Slot [30] registered Jan 16 21:15:57.079839 kernel: acpiphp: 
Slot [31] registered Jan 16 21:15:57.079956 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Jan 16 21:15:57.080065 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Jan 16 21:15:57.080165 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 16 21:15:57.080175 kernel: acpiphp: Slot [0-2] registered Jan 16 21:15:57.080279 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 16 21:15:57.080380 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Jan 16 21:15:57.080481 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Jan 16 21:15:57.080583 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 16 21:15:57.082163 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 16 21:15:57.082183 kernel: acpiphp: Slot [0-3] registered Jan 16 21:15:57.082293 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 16 21:15:57.082396 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Jan 16 21:15:57.082496 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Jan 16 21:15:57.083963 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 16 21:15:57.083986 kernel: acpiphp: Slot [0-4] registered Jan 16 21:15:57.084130 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 16 21:15:57.084236 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Jan 16 21:15:57.084338 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 16 21:15:57.084352 kernel: acpiphp: Slot [0-5] registered Jan 16 21:15:57.084469 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 16 21:15:57.084570 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Jan 16 21:15:57.084692 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Jan 16 21:15:57.084819 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 16 21:15:57.084832 kernel: acpiphp: Slot [0-6] 
registered Jan 16 21:15:57.084941 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 16 21:15:57.084956 kernel: acpiphp: Slot [0-7] registered Jan 16 21:15:57.085054 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 16 21:15:57.085066 kernel: acpiphp: Slot [0-8] registered Jan 16 21:15:57.085162 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 16 21:15:57.085174 kernel: acpiphp: Slot [0-9] registered Jan 16 21:15:57.085284 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 16 21:15:57.085298 kernel: acpiphp: Slot [0-10] registered Jan 16 21:15:57.085397 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 16 21:15:57.085409 kernel: acpiphp: Slot [0-11] registered Jan 16 21:15:57.085505 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 16 21:15:57.085517 kernel: acpiphp: Slot [0-12] registered Jan 16 21:15:57.085624 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 16 21:15:57.085636 kernel: acpiphp: Slot [0-13] registered Jan 16 21:15:57.085736 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 16 21:15:57.085747 kernel: acpiphp: Slot [0-14] registered Jan 16 21:15:57.085845 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 16 21:15:57.085856 kernel: acpiphp: Slot [0-15] registered Jan 16 21:15:57.085958 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 16 21:15:57.085970 kernel: acpiphp: Slot [0-16] registered Jan 16 21:15:57.086069 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 16 21:15:57.086081 kernel: acpiphp: Slot [0-17] registered Jan 16 21:15:57.086178 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 16 21:15:57.086189 kernel: acpiphp: Slot [0-18] registered Jan 16 21:15:57.086286 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 16 21:15:57.086298 kernel: acpiphp: Slot [0-19] registered Jan 16 21:15:57.086396 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 16 21:15:57.086407 kernel: acpiphp: Slot [0-20] registered Jan 16 21:15:57.086502 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 16 21:15:57.086513 
kernel: acpiphp: Slot [0-21] registered Jan 16 21:15:57.086639 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 16 21:15:57.086650 kernel: acpiphp: Slot [0-22] registered Jan 16 21:15:57.086750 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 16 21:15:57.086765 kernel: acpiphp: Slot [0-23] registered Jan 16 21:15:57.086863 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 16 21:15:57.086874 kernel: acpiphp: Slot [0-24] registered Jan 16 21:15:57.086972 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 16 21:15:57.086983 kernel: acpiphp: Slot [0-25] registered Jan 16 21:15:57.087080 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 16 21:15:57.087094 kernel: acpiphp: Slot [0-26] registered Jan 16 21:15:57.087189 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 16 21:15:57.087201 kernel: acpiphp: Slot [0-27] registered Jan 16 21:15:57.087297 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 16 21:15:57.087309 kernel: acpiphp: Slot [0-28] registered Jan 16 21:15:57.087407 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 16 21:15:57.087418 kernel: acpiphp: Slot [0-29] registered Jan 16 21:15:57.087518 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 16 21:15:57.087530 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 16 21:15:57.087539 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 16 21:15:57.087548 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 16 21:15:57.087556 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 16 21:15:57.087565 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jan 16 21:15:57.087576 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 16 21:15:57.087585 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 16 21:15:57.087593 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 16 21:15:57.087609 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 16 21:15:57.087618 kernel: ACPI: PCI: Interrupt 
link GSIB configured for IRQ 17 Jan 16 21:15:57.087626 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 16 21:15:57.087639 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 16 21:15:57.087655 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 16 21:15:57.087663 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 16 21:15:57.087672 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 16 21:15:57.087681 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 16 21:15:57.087689 kernel: iommu: Default domain type: Translated Jan 16 21:15:57.087697 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 16 21:15:57.087706 kernel: efivars: Registered efivars operations Jan 16 21:15:57.087714 kernel: PCI: Using ACPI for IRQ routing Jan 16 21:15:57.087725 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 16 21:15:57.087734 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Jan 16 21:15:57.087742 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Jan 16 21:15:57.087750 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Jan 16 21:15:57.087759 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Jan 16 21:15:57.087767 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Jan 16 21:15:57.087775 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Jan 16 21:15:57.087786 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 16 21:15:57.087795 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] Jan 16 21:15:57.087804 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Jan 16 21:15:57.087909 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 16 21:15:57.088007 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 16 21:15:57.088105 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 16 21:15:57.088115 kernel: vgaarb: loaded Jan 16 21:15:57.088126 
kernel: clocksource: Switched to clocksource kvm-clock Jan 16 21:15:57.088135 kernel: VFS: Disk quotas dquot_6.6.0 Jan 16 21:15:57.088144 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 16 21:15:57.088152 kernel: pnp: PnP ACPI init Jan 16 21:15:57.088260 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 16 21:15:57.088273 kernel: pnp: PnP ACPI: found 5 devices Jan 16 21:15:57.088284 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 16 21:15:57.088293 kernel: NET: Registered PF_INET protocol family Jan 16 21:15:57.088301 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 16 21:15:57.088310 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 16 21:15:57.088319 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 16 21:15:57.088327 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 16 21:15:57.088336 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 16 21:15:57.088347 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 16 21:15:57.088355 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 16 21:15:57.088364 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 16 21:15:57.088372 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 16 21:15:57.088381 kernel: NET: Registered PF_XDP protocol family Jan 16 21:15:57.088488 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 16 21:15:57.088590 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 16 21:15:57.088710 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 16 21:15:57.088810 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] 
add_size 1000 Jan 16 21:15:57.088909 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 16 21:15:57.089018 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 16 21:15:57.089119 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 16 21:15:57.089228 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 16 21:15:57.089332 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 16 21:15:57.089430 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 16 21:15:57.089528 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 16 21:15:57.089635 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 16 21:15:57.089734 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 16 21:15:57.089829 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 16 21:15:57.089927 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 16 21:15:57.090028 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 16 21:15:57.090126 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 16 21:15:57.090221 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 16 21:15:57.090318 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 16 21:15:57.090414 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 16 21:15:57.090511 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 16 21:15:57.090630 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 16 21:15:57.090729 kernel: pci 
0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 16 21:15:57.090826 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 16 21:15:57.090922 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 16 21:15:57.091019 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 16 21:15:57.091115 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 16 21:15:57.091215 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 16 21:15:57.091310 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 16 21:15:57.091405 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Jan 16 21:15:57.091501 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Jan 16 21:15:57.091604 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Jan 16 21:15:57.091718 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Jan 16 21:15:57.091815 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Jan 16 21:15:57.091914 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Jan 16 21:15:57.092010 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Jan 16 21:15:57.092106 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Jan 16 21:15:57.095461 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Jan 16 21:15:57.095584 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Jan 16 21:15:57.095695 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Jan 16 21:15:57.095798 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Jan 16 21:15:57.095894 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Jan 16 21:15:57.095991 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: 
can't assign; no space Jan 16 21:15:57.096087 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.096184 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.096279 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.096376 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.096474 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.096571 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.096675 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.096771 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.096866 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.096961 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.097059 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.097157 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.097287 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.097385 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.097480 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.097577 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.097687 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.097785 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.097882 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.097979 kernel: pci 0000:00:05.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.098075 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.098173 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.098269 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.098368 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.098464 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.098560 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.098672 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.098769 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.098864 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.098964 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Jan 16 21:15:57.099060 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Jan 16 21:15:57.099157 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Jan 16 21:15:57.099255 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Jan 16 21:15:57.099354 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Jan 16 21:15:57.099449 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Jan 16 21:15:57.099547 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Jan 16 21:15:57.099661 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Jan 16 21:15:57.099759 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Jan 16 21:15:57.099857 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Jan 16 21:15:57.099957 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Jan 16 21:15:57.100053 kernel: 
pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Jan 16 21:15:57.100150 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Jan 16 21:15:57.100253 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.100350 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.100448 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.100545 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.100653 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.100750 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.100849 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.100950 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.101047 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.101143 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.101263 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.101360 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.101457 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.101558 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.101665 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.101761 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.101857 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.101952 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.102052 
kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.102149 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.102251 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.102348 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.102444 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.102540 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.102660 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.102759 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.102862 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.102959 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.103059 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 21:15:57.103156 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 16 21:15:57.103258 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 16 21:15:57.103356 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 16 21:15:57.103454 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 16 21:15:57.103558 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 16 21:15:57.103669 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 16 21:15:57.103766 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 16 21:15:57.103871 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 16 21:15:57.103967 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 16 21:15:57.104070 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned 
Jan 16 21:15:57.104167 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 16 21:15:57.104264 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 16 21:15:57.104359 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 16 21:15:57.104454 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 16 21:15:57.104548 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 16 21:15:57.104653 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 16 21:15:57.104748 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 16 21:15:57.104843 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 16 21:15:57.104937 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 16 21:15:57.105035 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 16 21:15:57.105129 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 16 21:15:57.105234 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 16 21:15:57.105329 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 16 21:15:57.105424 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 16 21:15:57.105519 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 16 21:15:57.105624 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 16 21:15:57.105719 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 16 21:15:57.105814 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 16 21:15:57.105910 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 16 21:15:57.106004 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 16 21:15:57.106099 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 16 21:15:57.106197 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 16 
21:15:57.106292 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 16 21:15:57.106387 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 16 21:15:57.106482 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 16 21:15:57.106576 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 16 21:15:57.106690 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 16 21:15:57.106787 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 16 21:15:57.106880 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 16 21:15:57.106978 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 16 21:15:57.107074 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 16 21:15:57.107168 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 16 21:15:57.107263 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 16 21:15:57.107361 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 16 21:15:57.107455 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 16 21:15:57.107550 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 16 21:15:57.107671 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 16 21:15:57.107776 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 16 21:15:57.107891 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 16 21:15:57.107989 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 16 21:15:57.108084 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 16 21:15:57.108181 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 16 21:15:57.108278 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 16 21:15:57.108378 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 
16 21:15:57.108474 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 16 21:15:57.108573 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 16 21:15:57.108691 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 16 21:15:57.108789 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 16 21:15:57.108886 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 16 21:15:57.108986 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 16 21:15:57.109084 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 16 21:15:57.109181 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 16 21:15:57.109287 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 16 21:15:57.109386 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 16 21:15:57.109483 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 16 21:15:57.109583 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 16 21:15:57.109697 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 16 21:15:57.109799 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 16 21:15:57.109899 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 16 21:15:57.109999 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 16 21:15:57.110096 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 16 21:15:57.110199 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 16 21:15:57.110297 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 16 21:15:57.110395 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 16 21:15:57.114081 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 16 21:15:57.114206 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 16 21:15:57.114303 kernel: pci 
0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 16 21:15:57.114397 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 16 21:15:57.114499 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 16 21:15:57.114608 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 16 21:15:57.114705 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 16 21:15:57.114803 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 16 21:15:57.114898 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 16 21:15:57.114996 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 16 21:15:57.115092 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 16 21:15:57.115188 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 16 21:15:57.115282 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 16 21:15:57.115380 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 16 21:15:57.115486 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 16 21:15:57.115583 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 16 21:15:57.115696 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 16 21:15:57.115794 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 16 21:15:57.115891 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 16 21:15:57.115987 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 16 21:15:57.116083 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 16 21:15:57.116183 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 16 21:15:57.116283 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 16 21:15:57.116379 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 16 21:15:57.116475 kernel: pci 0000:00:05.2: bridge window [mem 
0x38d000000000-0x38d7ffffffff 64bit pref] Jan 16 21:15:57.116574 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 16 21:15:57.116691 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 16 21:15:57.116789 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 16 21:15:57.116889 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 16 21:15:57.116988 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 16 21:15:57.117086 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 16 21:15:57.117183 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 16 21:15:57.117307 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 16 21:15:57.117404 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 16 21:15:57.117496 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 16 21:15:57.117583 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 16 21:15:57.119749 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 16 21:15:57.119854 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 16 21:15:57.119944 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 16 21:15:57.120046 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 16 21:15:57.120143 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 16 21:15:57.120233 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 16 21:15:57.120332 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 16 21:15:57.120427 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 16 21:15:57.120520 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 16 21:15:57.120628 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 16 21:15:57.120725 kernel: pci_bus 0000:03: resource 2 [mem 
0x380800000000-0x380fffffffff 64bit pref] Jan 16 21:15:57.120821 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 16 21:15:57.120912 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 16 21:15:57.121007 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 16 21:15:57.121097 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 16 21:15:57.121211 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 16 21:15:57.121304 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 16 21:15:57.121400 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 16 21:15:57.121491 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 16 21:15:57.121587 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 16 21:15:57.121763 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 16 21:15:57.121860 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 16 21:15:57.121951 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 16 21:15:57.122048 kernel: pci_bus 0000:0a: resource 1 [mem 0x83000000-0x831fffff] Jan 16 21:15:57.122138 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 16 21:15:57.122237 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 16 21:15:57.122330 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 16 21:15:57.122429 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 16 21:15:57.122519 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 16 21:15:57.123572 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 16 21:15:57.123700 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 16 21:15:57.123798 
kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 16 21:15:57.123889 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 16 21:15:57.123988 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 16 21:15:57.124078 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 16 21:15:57.124176 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 16 21:15:57.124266 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 16 21:15:57.124360 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 16 21:15:57.124451 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 16 21:15:57.124549 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 16 21:15:57.125801 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 16 21:15:57.125911 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 16 21:15:57.126014 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 16 21:15:57.126105 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 16 21:15:57.126195 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 16 21:15:57.126290 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 16 21:15:57.126384 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 16 21:15:57.126473 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 16 21:15:57.126568 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 16 21:15:57.127754 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 16 21:15:57.127854 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 16 21:15:57.127954 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 16 21:15:57.128049 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] 
Jan 16 21:15:57.128138 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 16 21:15:57.128235 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 16 21:15:57.128326 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 16 21:15:57.128416 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 16 21:15:57.128514 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 16 21:15:57.129640 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 16 21:15:57.129755 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 16 21:15:57.129856 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 16 21:15:57.129947 kernel: pci_bus 0000:19: resource 1 [mem 0x81200000-0x813fffff] Jan 16 21:15:57.130038 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 16 21:15:57.130138 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 16 21:15:57.130228 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 16 21:15:57.130317 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 16 21:15:57.130412 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 16 21:15:57.130502 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 16 21:15:57.130594 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 16 21:15:57.131117 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 16 21:15:57.131212 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 16 21:15:57.131303 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 16 21:15:57.131796 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 16 21:15:57.131892 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 16 21:15:57.131986 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit 
pref] Jan 16 21:15:57.132080 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 16 21:15:57.132171 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 16 21:15:57.132260 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 16 21:15:57.132272 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 16 21:15:57.132281 kernel: PCI: CLS 0 bytes, default 64 Jan 16 21:15:57.132292 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 16 21:15:57.132301 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 16 21:15:57.132310 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 16 21:15:57.132319 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 16 21:15:57.132328 kernel: Initialise system trusted keyrings Jan 16 21:15:57.132337 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 16 21:15:57.132345 kernel: Key type asymmetric registered Jan 16 21:15:57.132355 kernel: Asymmetric key parser 'x509' registered Jan 16 21:15:57.132364 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 16 21:15:57.132372 kernel: io scheduler mq-deadline registered Jan 16 21:15:57.132381 kernel: io scheduler kyber registered Jan 16 21:15:57.132389 kernel: io scheduler bfq registered Jan 16 21:15:57.132500 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 16 21:15:57.132622 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 16 21:15:57.132736 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 16 21:15:57.132836 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 16 21:15:57.132937 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 16 21:15:57.133034 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 16 21:15:57.133132 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 16 21:15:57.133245 
kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 16 21:15:57.133348 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 16 21:15:57.133445 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 16 21:15:57.133544 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 16 21:15:57.134340 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 16 21:15:57.134460 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 16 21:15:57.134561 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 16 21:15:57.134674 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 16 21:15:57.135359 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 16 21:15:57.135377 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 16 21:15:57.135488 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 16 21:15:57.135588 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 16 21:15:57.136905 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 16 21:15:57.137010 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 16 21:15:57.137111 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 16 21:15:57.137228 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 16 21:15:57.137328 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 16 21:15:57.137424 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 16 21:15:57.137523 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 16 21:15:57.137631 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 16 21:15:57.137733 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 16 21:15:57.137831 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 16 21:15:57.137929 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 16 21:15:57.138026 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 16 21:15:57.138123 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 16 21:15:57.138710 kernel: 
pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 16 21:15:57.138729 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 16 21:15:57.138834 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 16 21:15:57.138932 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 16 21:15:57.139030 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 16 21:15:57.142669 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 16 21:15:57.142794 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 16 21:15:57.142897 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 16 21:15:57.142998 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 16 21:15:57.143094 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 16 21:15:57.143192 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 16 21:15:57.143289 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 16 21:15:57.143390 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 16 21:15:57.143491 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 16 21:15:57.143590 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 16 21:15:57.146563 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 16 21:15:57.146708 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 16 21:15:57.146809 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 16 21:15:57.146821 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 16 21:15:57.146918 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 16 21:15:57.147020 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 16 21:15:57.147119 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 16 21:15:57.147215 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 16 21:15:57.147314 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 16 21:15:57.147410 kernel: pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 16 21:15:57.147507 kernel: pcieport 0000:00:05.3: 
PME: Signaling with IRQ 51 Jan 16 21:15:57.149125 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 16 21:15:57.149284 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 16 21:15:57.149386 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 16 21:15:57.149398 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 16 21:15:57.149407 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 16 21:15:57.149416 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 16 21:15:57.149425 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 16 21:15:57.149434 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 16 21:15:57.149446 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 16 21:15:57.149553 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 16 21:15:57.149566 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 16 21:15:57.149961 kernel: rtc_cmos 00:03: registered as rtc0 Jan 16 21:15:57.150060 kernel: rtc_cmos 00:03: setting system clock to 2026-01-16T21:15:55 UTC (1768598155) Jan 16 21:15:57.150152 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 16 21:15:57.150167 kernel: intel_pstate: CPU model not supported Jan 16 21:15:57.150176 kernel: efifb: probing for efifb Jan 16 21:15:57.150186 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 16 21:15:57.150194 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 16 21:15:57.150203 kernel: efifb: scrolling: redraw Jan 16 21:15:57.150211 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 16 21:15:57.150220 kernel: Console: switching to colour frame buffer device 160x50 Jan 16 21:15:57.150230 kernel: fb0: EFI VGA frame buffer device Jan 16 21:15:57.150239 kernel: pstore: Using crash dump compression: deflate Jan 16 21:15:57.150247 kernel: pstore: Registered efi_pstore as persistent store backend Jan 16 
21:15:57.150255 kernel: NET: Registered PF_INET6 protocol family Jan 16 21:15:57.150264 kernel: Segment Routing with IPv6 Jan 16 21:15:57.150273 kernel: In-situ OAM (IOAM) with IPv6 Jan 16 21:15:57.150281 kernel: NET: Registered PF_PACKET protocol family Jan 16 21:15:57.150292 kernel: Key type dns_resolver registered Jan 16 21:15:57.150300 kernel: IPI shorthand broadcast: enabled Jan 16 21:15:57.150309 kernel: sched_clock: Marking stable (2524001623, 153145294)->(2775470568, -98323651) Jan 16 21:15:57.150319 kernel: registered taskstats version 1 Jan 16 21:15:57.150327 kernel: Loading compiled-in X.509 certificates Jan 16 21:15:57.150336 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: a9591db9912320a48a0589d0293fff3e535b90df' Jan 16 21:15:57.150344 kernel: Demotion targets for Node 0: null Jan 16 21:15:57.150354 kernel: Key type .fscrypt registered Jan 16 21:15:57.150363 kernel: Key type fscrypt-provisioning registered Jan 16 21:15:57.150371 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 16 21:15:57.150380 kernel: ima: Allocated hash algorithm: sha1 Jan 16 21:15:57.150389 kernel: ima: No architecture policies found Jan 16 21:15:57.150397 kernel: clk: Disabling unused clocks Jan 16 21:15:57.150406 kernel: Freeing unused kernel image (initmem) memory: 15536K Jan 16 21:15:57.150414 kernel: Write protecting the kernel read-only data: 47104k Jan 16 21:15:57.150425 kernel: Freeing unused kernel image (rodata/data gap) memory: 1124K Jan 16 21:15:57.150433 kernel: Run /init as init process Jan 16 21:15:57.150442 kernel: with arguments: Jan 16 21:15:57.150451 kernel: /init Jan 16 21:15:57.150459 kernel: with environment: Jan 16 21:15:57.150468 kernel: HOME=/ Jan 16 21:15:57.150476 kernel: TERM=linux Jan 16 21:15:57.150486 kernel: SCSI subsystem initialized Jan 16 21:15:57.150495 kernel: libata version 3.00 loaded. 
Jan 16 21:15:57.150611 kernel: ahci 0000:00:1f.2: version 3.0 Jan 16 21:15:57.150623 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 16 21:15:57.150752 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 16 21:15:57.150852 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 16 21:15:57.150950 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 16 21:15:57.151071 kernel: scsi host0: ahci Jan 16 21:15:57.151177 kernel: scsi host1: ahci Jan 16 21:15:57.151299 kernel: scsi host2: ahci Jan 16 21:15:57.151402 kernel: scsi host3: ahci Jan 16 21:15:57.151508 kernel: scsi host4: ahci Jan 16 21:15:57.151630 kernel: scsi host5: ahci Jan 16 21:15:57.151642 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 55 lpm-pol 1 Jan 16 21:15:57.151651 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 55 lpm-pol 1 Jan 16 21:15:57.151660 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 55 lpm-pol 1 Jan 16 21:15:57.151669 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 55 lpm-pol 1 Jan 16 21:15:57.151678 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 55 lpm-pol 1 Jan 16 21:15:57.151690 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 55 lpm-pol 1 Jan 16 21:15:57.151698 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 16 21:15:57.151707 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 16 21:15:57.151716 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 16 21:15:57.151725 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 16 21:15:57.151733 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 16 21:15:57.151742 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 16 21:15:57.151751 kernel: ACPI: bus type USB registered Jan 16 21:15:57.151761 kernel: usbcore: registered new interface driver usbfs Jan 16 21:15:57.151770 kernel: usbcore: registered 
new interface driver hub Jan 16 21:15:57.151778 kernel: usbcore: registered new device driver usb Jan 16 21:15:57.151886 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 16 21:15:57.151993 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 16 21:15:57.152096 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 16 21:15:57.152199 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 16 21:15:57.152330 kernel: hub 1-0:1.0: USB hub found Jan 16 21:15:57.152441 kernel: hub 1-0:1.0: 2 ports detected Jan 16 21:15:57.152554 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 16 21:15:57.152682 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 16 21:15:57.152697 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 16 21:15:57.152706 kernel: GPT:25804799 != 104857599 Jan 16 21:15:57.152715 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 16 21:15:57.152724 kernel: GPT:25804799 != 104857599 Jan 16 21:15:57.152732 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 16 21:15:57.152741 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 16 21:15:57.152750 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 16 21:15:57.152761 kernel: device-mapper: uevent: version 1.0.3 Jan 16 21:15:57.152770 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 16 21:15:57.152779 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 16 21:15:57.152787 kernel: raid6: avx512x4 gen() 42728 MB/s Jan 16 21:15:57.152796 kernel: raid6: avx512x2 gen() 46894 MB/s Jan 16 21:15:57.152804 kernel: raid6: avx512x1 gen() 44170 MB/s Jan 16 21:15:57.152813 kernel: raid6: avx2x4 gen() 34508 MB/s Jan 16 21:15:57.152824 kernel: raid6: avx2x2 gen() 33902 MB/s Jan 16 21:15:57.152833 kernel: raid6: avx2x1 gen() 30442 MB/s Jan 16 21:15:57.152842 kernel: raid6: using algorithm avx512x2 gen() 46894 MB/s Jan 16 21:15:57.152851 kernel: raid6: .... xor() 26772 MB/s, rmw enabled Jan 16 21:15:57.152863 kernel: raid6: using avx512x2 recovery algorithm Jan 16 21:15:57.152872 kernel: xor: automatically using best checksumming function avx Jan 16 21:15:57.152881 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 16 21:15:57.152892 kernel: BTRFS: device fsid a5f82c06-1ff1-43b3-a650-214802f1359b devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (204) Jan 16 21:15:57.152903 kernel: BTRFS info (device dm-0): first mount of filesystem a5f82c06-1ff1-43b3-a650-214802f1359b Jan 16 21:15:57.152912 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 16 21:15:57.153039 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 16 21:15:57.153053 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 16 21:15:57.153062 kernel: BTRFS info (device dm-0): enabling free space tree Jan 16 21:15:57.153073 kernel: loop: module loaded Jan 16 21:15:57.153082 kernel: loop0: detected capacity change from 0 to 100536 Jan 16 21:15:57.153091 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 16 21:15:57.153102 systemd[1]: Successfully made /usr/ read-only. 
Jan 16 21:15:57.153261 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 16 21:15:57.153271 systemd[1]: Detected virtualization kvm. Jan 16 21:15:57.153283 systemd[1]: Detected architecture x86-64. Jan 16 21:15:57.153292 systemd[1]: Running in initrd. Jan 16 21:15:57.153301 systemd[1]: No hostname configured, using default hostname. Jan 16 21:15:57.153311 systemd[1]: Hostname set to . Jan 16 21:15:57.153320 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 16 21:15:57.153329 systemd[1]: Queued start job for default target initrd.target. Jan 16 21:15:57.153339 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 16 21:15:57.153350 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 16 21:15:57.153360 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 16 21:15:57.153369 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 16 21:15:57.153379 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 16 21:15:57.153389 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 16 21:15:57.153400 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 16 21:15:57.153410 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 16 21:15:57.153420 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
Jan 16 21:15:57.153429 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 16 21:15:57.153438 systemd[1]: Reached target paths.target - Path Units. Jan 16 21:15:57.153447 systemd[1]: Reached target slices.target - Slice Units. Jan 16 21:15:57.153456 systemd[1]: Reached target swap.target - Swaps. Jan 16 21:15:57.153467 systemd[1]: Reached target timers.target - Timer Units. Jan 16 21:15:57.153477 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 16 21:15:57.153486 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 16 21:15:57.153495 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 16 21:15:57.153505 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 16 21:15:57.153514 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 16 21:15:57.153524 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 16 21:15:57.153535 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 16 21:15:57.153544 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 16 21:15:57.153553 systemd[1]: Reached target sockets.target - Socket Units. Jan 16 21:15:57.153563 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 16 21:15:57.153572 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 16 21:15:57.153581 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 16 21:15:57.153592 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 16 21:15:57.153626 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). 
Jan 16 21:15:57.153635 systemd[1]: Starting systemd-fsck-usr.service... Jan 16 21:15:57.153645 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 16 21:15:57.153654 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 16 21:15:57.153667 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 21:15:57.153676 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 16 21:15:57.153685 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 16 21:15:57.153695 systemd[1]: Finished systemd-fsck-usr.service. Jan 16 21:15:57.153730 systemd-journald[341]: Collecting audit messages is enabled. Jan 16 21:15:57.153756 kernel: audit: type=1130 audit(1768598157.032:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.153766 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 16 21:15:57.153775 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 21:15:57.153787 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 16 21:15:57.153796 kernel: audit: type=1130 audit(1768598157.074:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.153806 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 16 21:15:57.153815 kernel: Bridge firewalling registered Jan 16 21:15:57.153824 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Jan 16 21:15:57.153834 kernel: audit: type=1130 audit(1768598157.091:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.153843 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 16 21:15:57.153855 kernel: audit: type=1130 audit(1768598157.109:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.153864 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 16 21:15:57.153873 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 16 21:15:57.153883 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 16 21:15:57.153892 kernel: audit: type=1130 audit(1768598157.138:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.153901 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 16 21:15:57.153913 kernel: audit: type=1334 audit(1768598157.138:7): prog-id=6 op=LOAD Jan 16 21:15:57.153921 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 16 21:15:57.153932 systemd-journald[341]: Journal started Jan 16 21:15:57.153952 systemd-journald[341]: Runtime Journal (/run/log/journal/d25a18b3c144475eaaff4d6ad7796358) is 8M, max 77.9M, 69.9M free. Jan 16 21:15:57.032000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:15:57.074000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.091000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.109000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.138000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.138000 audit: BPF prog-id=6 op=LOAD Jan 16 21:15:57.087678 systemd-modules-load[343]: Inserted module 'br_netfilter' Jan 16 21:15:57.155000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.161360 kernel: audit: type=1130 audit(1768598157.155:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.161402 systemd[1]: Started systemd-journald.service - Journal Service. Jan 16 21:15:57.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.162533 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 16 21:15:57.168609 kernel: audit: type=1130 audit(1768598157.161:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.168643 kernel: audit: type=1130 audit(1768598157.166:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.172074 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 16 21:15:57.180744 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 16 21:15:57.197109 dracut-cmdline[378]: dracut-109 Jan 16 21:15:57.200668 systemd-tmpfiles[379]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 16 21:15:57.204609 dracut-cmdline[378]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=e880b5400e832e1de59b993d9ba6b86a9089175f10b4985da8b7b47cc8c74099 Jan 16 21:15:57.207067 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 16 21:15:57.207000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:15:57.212545 systemd-resolved[365]: Positive Trust Anchors: Jan 16 21:15:57.212556 systemd-resolved[365]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 16 21:15:57.212560 systemd-resolved[365]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 16 21:15:57.212731 systemd-resolved[365]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 16 21:15:57.237268 systemd-resolved[365]: Defaulting to hostname 'linux'. Jan 16 21:15:57.238852 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 16 21:15:57.239000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.240060 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 16 21:15:57.299706 kernel: Loading iSCSI transport class v2.0-870. Jan 16 21:15:57.319626 kernel: iscsi: registered transport (tcp) Jan 16 21:15:57.344934 kernel: iscsi: registered transport (qla4xxx) Jan 16 21:15:57.345000 kernel: QLogic iSCSI HBA Driver Jan 16 21:15:57.371759 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 16 21:15:57.388564 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. 
Jan 16 21:15:57.389000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.391361 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 16 21:15:57.428756 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 16 21:15:57.428000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.430725 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 16 21:15:57.431877 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 16 21:15:57.463304 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 16 21:15:57.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.464000 audit: BPF prog-id=7 op=LOAD Jan 16 21:15:57.464000 audit: BPF prog-id=8 op=LOAD Jan 16 21:15:57.465584 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 16 21:15:57.491459 systemd-udevd[609]: Using default interface naming scheme 'v257'. Jan 16 21:15:57.500586 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 16 21:15:57.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.503923 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Jan 16 21:15:57.527633 dracut-pre-trigger[689]: rd.md=0: removing MD RAID activation Jan 16 21:15:57.530666 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 16 21:15:57.530000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.531000 audit: BPF prog-id=9 op=LOAD Jan 16 21:15:57.534864 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 16 21:15:57.557724 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 16 21:15:57.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.561736 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 16 21:15:57.578385 systemd-networkd[727]: lo: Link UP Jan 16 21:15:57.578391 systemd-networkd[727]: lo: Gained carrier Jan 16 21:15:57.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.581861 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 16 21:15:57.582429 systemd[1]: Reached target network.target - Network. Jan 16 21:15:57.650517 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 16 21:15:57.650000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.652743 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Jan 16 21:15:57.732493 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 16 21:15:57.754314 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 16 21:15:57.768054 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 16 21:15:57.770724 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 16 21:15:57.780606 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 16 21:15:57.799682 disk-uuid[787]: Primary Header is updated. Jan 16 21:15:57.799682 disk-uuid[787]: Secondary Entries is updated. Jan 16 21:15:57.799682 disk-uuid[787]: Secondary Header is updated. Jan 16 21:15:57.824622 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 16 21:15:57.872333 kernel: cryptd: max_cpu_qlen set to 1000 Jan 16 21:15:57.885022 kernel: usbcore: registered new interface driver usbhid Jan 16 21:15:57.885094 kernel: usbhid: USB HID core driver Jan 16 21:15:57.888240 systemd-networkd[727]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 21:15:57.889785 systemd-networkd[727]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Jan 16 21:15:57.890566 systemd-networkd[727]: eth0: Link UP Jan 16 21:15:57.890731 systemd-networkd[727]: eth0: Gained carrier Jan 16 21:15:57.890745 systemd-networkd[727]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 21:15:57.899618 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Jan 16 21:15:57.900679 systemd-networkd[727]: eth0: DHCPv4 address 10.0.3.156/25, gateway 10.0.3.129 acquired from 10.0.3.129 Jan 16 21:15:57.914662 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jan 16 21:15:57.923350 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 16 21:15:57.923964 kernel: AES CTR mode by8 optimization enabled Jan 16 21:15:57.923402 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 21:15:57.924000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.925483 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 21:15:57.929814 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 16 21:15:57.928310 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 21:15:57.967891 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 21:15:57.967000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.993478 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Jan 16 21:15:57.994000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:57.995369 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 16 21:15:57.996370 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 16 21:15:57.996838 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 16 21:15:57.998583 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 16 21:15:58.018848 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 16 21:15:58.018000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:58.851853 disk-uuid[788]: Warning: The kernel is still using the old partition table. Jan 16 21:15:58.851853 disk-uuid[788]: The new table will be used at the next reboot or after you Jan 16 21:15:58.851853 disk-uuid[788]: run partprobe(8) or kpartx(8) Jan 16 21:15:58.851853 disk-uuid[788]: The operation has completed successfully. Jan 16 21:15:58.856968 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 16 21:15:58.868664 kernel: kauditd_printk_skb: 17 callbacks suppressed Jan 16 21:15:58.868701 kernel: audit: type=1130 audit(1768598158.856:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:58.868723 kernel: audit: type=1131 audit(1768598158.856:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:15:58.856000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:58.856000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:58.857065 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 16 21:15:58.859708 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 16 21:15:58.917645 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (907) Jan 16 21:15:58.920710 kernel: BTRFS info (device vda6): first mount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 16 21:15:58.920754 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 16 21:15:58.930067 kernel: BTRFS info (device vda6): turning on async discard Jan 16 21:15:58.930133 kernel: BTRFS info (device vda6): enabling free space tree Jan 16 21:15:58.936620 kernel: BTRFS info (device vda6): last unmount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 16 21:15:58.937157 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 16 21:15:58.941740 kernel: audit: type=1130 audit(1768598158.937:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:58.937000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:58.940726 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Jan 16 21:15:59.155232 ignition[926]: Ignition 2.24.0 Jan 16 21:15:59.156014 ignition[926]: Stage: fetch-offline Jan 16 21:15:59.156066 ignition[926]: no configs at "/usr/lib/ignition/base.d" Jan 16 21:15:59.157583 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 16 21:15:59.156076 ignition[926]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 16 21:15:59.163688 kernel: audit: type=1130 audit(1768598159.158:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.158000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.160177 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 16 21:15:59.156160 ignition[926]: parsed url from cmdline: "" Jan 16 21:15:59.156163 ignition[926]: no config URL provided Jan 16 21:15:59.156167 ignition[926]: reading system config file "/usr/lib/ignition/user.ign" Jan 16 21:15:59.156174 ignition[926]: no config at "/usr/lib/ignition/user.ign" Jan 16 21:15:59.156179 ignition[926]: failed to fetch config: resource requires networking Jan 16 21:15:59.156343 ignition[926]: Ignition finished successfully Jan 16 21:15:59.183054 ignition[933]: Ignition 2.24.0 Jan 16 21:15:59.183782 ignition[933]: Stage: fetch Jan 16 21:15:59.183938 ignition[933]: no configs at "/usr/lib/ignition/base.d" Jan 16 21:15:59.183946 ignition[933]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 16 21:15:59.184019 ignition[933]: parsed url from cmdline: "" Jan 16 21:15:59.184023 ignition[933]: no config URL provided Jan 16 21:15:59.184027 ignition[933]: reading system config file "/usr/lib/ignition/user.ign" Jan 16 21:15:59.184033 ignition[933]: no config at 
"/usr/lib/ignition/user.ign" Jan 16 21:15:59.184115 ignition[933]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 16 21:15:59.184449 ignition[933]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 16 21:15:59.184474 ignition[933]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 16 21:15:59.227790 systemd-networkd[727]: eth0: Gained IPv6LL Jan 16 21:15:59.704770 ignition[933]: GET result: OK Jan 16 21:15:59.704916 ignition[933]: parsing config with SHA512: 55c17aa7337bebd1c9ad0599d3dabf8969f9113cc8e3ae137b8f31b014b2a43d0eb4a2e4855325d90b5922a1d67ed5b6736cfb076f305ed7d696178f6464fa52 Jan 16 21:15:59.715276 unknown[933]: fetched base config from "system" Jan 16 21:15:59.715296 unknown[933]: fetched base config from "system" Jan 16 21:15:59.715307 unknown[933]: fetched user config from "openstack" Jan 16 21:15:59.717898 ignition[933]: fetch: fetch complete Jan 16 21:15:59.717904 ignition[933]: fetch: fetch passed Jan 16 21:15:59.720366 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 16 21:15:59.729955 kernel: audit: type=1130 audit(1768598159.721:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.721000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.717968 ignition[933]: Ignition finished successfully Jan 16 21:15:59.724836 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Jan 16 21:15:59.754065 ignition[939]: Ignition 2.24.0 Jan 16 21:15:59.754076 ignition[939]: Stage: kargs Jan 16 21:15:59.754210 ignition[939]: no configs at "/usr/lib/ignition/base.d" Jan 16 21:15:59.764431 kernel: audit: type=1130 audit(1768598159.756:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.756331 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 16 21:15:59.754218 ignition[939]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 16 21:15:59.758714 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 16 21:15:59.754830 ignition[939]: kargs: kargs passed Jan 16 21:15:59.754868 ignition[939]: Ignition finished successfully Jan 16 21:15:59.788537 ignition[945]: Ignition 2.24.0 Jan 16 21:15:59.789572 ignition[945]: Stage: disks Jan 16 21:15:59.789836 ignition[945]: no configs at "/usr/lib/ignition/base.d" Jan 16 21:15:59.789850 ignition[945]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 16 21:15:59.790929 ignition[945]: disks: disks passed Jan 16 21:15:59.790993 ignition[945]: Ignition finished successfully Jan 16 21:15:59.792971 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 16 21:15:59.796750 kernel: audit: type=1130 audit(1768598159.793:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.793000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:15:59.794577 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 16 21:15:59.797383 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 16 21:15:59.798231 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 16 21:15:59.799196 systemd[1]: Reached target sysinit.target - System Initialization. Jan 16 21:15:59.800106 systemd[1]: Reached target basic.target - Basic System. Jan 16 21:15:59.802505 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 16 21:15:59.854580 systemd-fsck[953]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 16 21:15:59.857454 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 16 21:15:59.862345 kernel: audit: type=1130 audit(1768598159.857:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.857000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:15:59.859739 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 16 21:16:00.025620 kernel: EXT4-fs (vda9): mounted filesystem ec5ae8d3-548b-4a34-bd68-b1a953fcffb6 r/w with ordered data mode. Quota mode: none. Jan 16 21:16:00.025986 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 16 21:16:00.027375 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 16 21:16:00.031069 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 16 21:16:00.032880 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 16 21:16:00.034900 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
Jan 16 21:16:00.035733 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 16 21:16:00.036166 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 16 21:16:00.036195 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 16 21:16:00.051311 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 16 21:16:00.054054 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 16 21:16:00.074621 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (961) Jan 16 21:16:00.078731 kernel: BTRFS info (device vda6): first mount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 16 21:16:00.078784 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 16 21:16:00.090282 kernel: BTRFS info (device vda6): turning on async discard Jan 16 21:16:00.090346 kernel: BTRFS info (device vda6): enabling free space tree Jan 16 21:16:00.092341 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 16 21:16:00.139628 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:00.283728 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 16 21:16:00.287914 kernel: audit: type=1130 audit(1768598160.283:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:00.283000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:00.286685 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 16 21:16:00.290710 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Jan 16 21:16:00.302839 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 16 21:16:00.304700 kernel: BTRFS info (device vda6): last unmount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 16 21:16:00.326656 ignition[1061]: INFO : Ignition 2.24.0 Jan 16 21:16:00.328111 ignition[1061]: INFO : Stage: mount Jan 16 21:16:00.328111 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 16 21:16:00.328111 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 16 21:16:00.330614 ignition[1061]: INFO : mount: mount passed Jan 16 21:16:00.330614 ignition[1061]: INFO : Ignition finished successfully Jan 16 21:16:00.335771 kernel: audit: type=1130 audit(1768598160.331:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:00.331000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:00.331234 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 16 21:16:00.339321 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 16 21:16:00.339000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:01.210631 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:03.215629 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:07.224620 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:07.231926 coreos-metadata[963]: Jan 16 21:16:07.231 WARN failed to locate config-drive, using the metadata service API instead Jan 16 21:16:07.246454 coreos-metadata[963]: Jan 16 21:16:07.246 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 16 21:16:07.412541 coreos-metadata[963]: Jan 16 21:16:07.412 INFO Fetch successful Jan 16 21:16:07.413220 coreos-metadata[963]: Jan 16 21:16:07.412 INFO wrote hostname ci-4580-0-0-p-be73a47b79 to /sysroot/etc/hostname Jan 16 21:16:07.414787 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 16 21:16:07.419696 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:16:07.419721 kernel: audit: type=1130 audit(1768598167.414:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:07.414000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:07.414912 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 16 21:16:07.436334 kernel: audit: type=1131 audit(1768598167.414:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:07.414000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 16 21:16:07.416707 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 16 21:16:07.449353 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 16 21:16:07.503623 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1078) Jan 16 21:16:07.507615 kernel: BTRFS info (device vda6): first mount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 16 21:16:07.507668 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 16 21:16:07.521807 kernel: BTRFS info (device vda6): turning on async discard Jan 16 21:16:07.521884 kernel: BTRFS info (device vda6): enabling free space tree Jan 16 21:16:07.523422 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 16 21:16:07.546663 ignition[1096]: INFO : Ignition 2.24.0 Jan 16 21:16:07.546663 ignition[1096]: INFO : Stage: files Jan 16 21:16:07.547914 ignition[1096]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 16 21:16:07.547914 ignition[1096]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 16 21:16:07.547914 ignition[1096]: DEBUG : files: compiled without relabeling support, skipping Jan 16 21:16:07.549234 ignition[1096]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 16 21:16:07.549234 ignition[1096]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 16 21:16:07.557063 ignition[1096]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 16 21:16:07.557758 ignition[1096]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 16 21:16:07.558381 ignition[1096]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 16 21:16:07.557953 unknown[1096]: wrote ssh authorized keys file for user: core Jan 16 21:16:07.562285 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file 
"/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 16 21:16:07.562285 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jan 16 21:16:07.618931 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 16 21:16:07.737371 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 16 21:16:07.738490 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 16 21:16:07.738490 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 16 21:16:07.738490 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 16 21:16:07.738490 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 16 21:16:07.738490 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 16 21:16:07.738490 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 16 21:16:07.738490 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 16 21:16:07.738490 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 16 21:16:07.741980 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 16 21:16:07.741980 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file 
"/sysroot/etc/flatcar/update.conf" Jan 16 21:16:07.741980 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 16 21:16:07.743451 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 16 21:16:07.744340 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 16 21:16:07.744340 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jan 16 21:16:08.044559 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 16 21:16:09.134056 ignition[1096]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 16 21:16:09.134056 ignition[1096]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 16 21:16:09.138069 ignition[1096]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 16 21:16:09.142852 ignition[1096]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 16 21:16:09.142852 ignition[1096]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 16 21:16:09.142852 ignition[1096]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 16 21:16:09.146721 ignition[1096]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 16 21:16:09.146721 ignition[1096]: INFO 
: files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 16 21:16:09.146721 ignition[1096]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 16 21:16:09.146721 ignition[1096]: INFO : files: files passed Jan 16 21:16:09.146721 ignition[1096]: INFO : Ignition finished successfully Jan 16 21:16:09.154110 kernel: audit: type=1130 audit(1768598169.146:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.146000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.146615 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 16 21:16:09.149718 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 16 21:16:09.153723 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 16 21:16:09.180386 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 16 21:16:09.180703 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 16 21:16:09.181000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.181000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.187044 kernel: audit: type=1130 audit(1768598169.181:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 16 21:16:09.187079 kernel: audit: type=1131 audit(1768598169.181:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.187368 initrd-setup-root-after-ignition[1128]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 16 21:16:09.190589 initrd-setup-root-after-ignition[1128]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 16 21:16:09.191086 initrd-setup-root-after-ignition[1132]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 16 21:16:09.192462 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 16 21:16:09.192000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.193888 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 16 21:16:09.197680 kernel: audit: type=1130 audit(1768598169.192:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.198722 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 16 21:16:09.243539 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 16 21:16:09.243694 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 16 21:16:09.252049 kernel: audit: type=1130 audit(1768598169.244:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:09.252077 kernel: audit: type=1131 audit(1768598169.244:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.244000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.244000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.245104 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 16 21:16:09.252557 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 16 21:16:09.253681 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 16 21:16:09.254717 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 16 21:16:09.278346 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 16 21:16:09.282959 kernel: audit: type=1130 audit(1768598169.278:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.278000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.280674 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 16 21:16:09.306201 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 16 21:16:09.307109 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. 
Jan 16 21:16:09.308393 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 16 21:16:09.308986 systemd[1]: Stopped target timers.target - Timer Units. Jan 16 21:16:09.314115 kernel: audit: type=1131 audit(1768598169.309:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.309000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.309623 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 16 21:16:09.309730 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 16 21:16:09.314184 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 16 21:16:09.314787 systemd[1]: Stopped target basic.target - Basic System. Jan 16 21:16:09.315772 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 16 21:16:09.316709 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 16 21:16:09.317635 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 16 21:16:09.318540 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 16 21:16:09.319438 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 16 21:16:09.320328 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 16 21:16:09.321234 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 16 21:16:09.322126 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 16 21:16:09.324000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:09.323009 systemd[1]: Stopped target swap.target - Swaps. Jan 16 21:16:09.323841 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 16 21:16:09.323946 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 16 21:16:09.325166 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 16 21:16:09.327000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.325652 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 16 21:16:09.326412 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 16 21:16:09.328000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.326480 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 16 21:16:09.329000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.327242 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 16 21:16:09.327337 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 16 21:16:09.328633 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 16 21:16:09.328737 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 16 21:16:09.329563 systemd[1]: ignition-files.service: Deactivated successfully. Jan 16 21:16:09.329660 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 16 21:16:09.332747 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... 
Jan 16 21:16:09.333000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.333196 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 16 21:16:09.333299 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 16 21:16:09.336000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.335770 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 16 21:16:09.336170 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 16 21:16:09.336274 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 16 21:16:09.337000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.337665 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 16 21:16:09.339000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.337763 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 16 21:16:09.338357 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 16 21:16:09.338441 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 16 21:16:09.344288 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 16 21:16:09.344377 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Jan 16 21:16:09.347000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.356728 ignition[1152]: INFO : Ignition 2.24.0 Jan 16 21:16:09.358151 ignition[1152]: INFO : Stage: umount Jan 16 21:16:09.358151 ignition[1152]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 16 21:16:09.358151 ignition[1152]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 16 21:16:09.360294 ignition[1152]: INFO : umount: umount passed Jan 16 21:16:09.360294 ignition[1152]: INFO : Ignition finished successfully Jan 16 21:16:09.360000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.360512 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 16 21:16:09.360657 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 16 21:16:09.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.361496 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 16 21:16:09.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.361567 systemd[1]: Stopped ignition-disks.service - Ignition (disks). 
Jan 16 21:16:09.363000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.362965 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 16 21:16:09.363011 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 16 21:16:09.365000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.363687 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 16 21:16:09.363725 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 16 21:16:09.364919 systemd[1]: Stopped target network.target - Network. Jan 16 21:16:09.365414 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 16 21:16:09.365457 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 16 21:16:09.366140 systemd[1]: Stopped target paths.target - Path Units. Jan 16 21:16:09.366761 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 16 21:16:09.370687 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 16 21:16:09.371072 systemd[1]: Stopped target slices.target - Slice Units. Jan 16 21:16:09.371783 systemd[1]: Stopped target sockets.target - Socket Units. Jan 16 21:16:09.372500 systemd[1]: iscsid.socket: Deactivated successfully. Jan 16 21:16:09.372541 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 16 21:16:09.373196 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 16 21:16:09.373225 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. 
Jan 16 21:16:09.374000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.373865 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 16 21:16:09.375000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.373889 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 16 21:16:09.374512 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 16 21:16:09.374556 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 16 21:16:09.375204 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 16 21:16:09.375237 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 16 21:16:09.375973 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 16 21:16:09.376826 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 16 21:16:09.380000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.379643 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 16 21:16:09.380205 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 16 21:16:09.381000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.380287 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 16 21:16:09.381326 systemd[1]: initrd-setup-root.service: Deactivated successfully. 
Jan 16 21:16:09.381404 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 16 21:16:09.384999 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 16 21:16:09.385143 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 16 21:16:09.385000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.387795 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 16 21:16:09.387900 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 16 21:16:09.387000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.389947 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 16 21:16:09.390000 audit: BPF prog-id=6 op=UNLOAD Jan 16 21:16:09.390348 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 16 21:16:09.390385 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 16 21:16:09.391000 audit: BPF prog-id=9 op=UNLOAD Jan 16 21:16:09.391784 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 16 21:16:09.393645 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 16 21:16:09.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.393697 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 16 21:16:09.394000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:09.394406 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 16 21:16:09.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.394442 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 16 21:16:09.395072 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 16 21:16:09.395107 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 16 21:16:09.396804 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 16 21:16:09.410686 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 16 21:16:09.410818 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 16 21:16:09.411000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.413010 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 16 21:16:09.413085 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 16 21:16:09.413737 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 16 21:16:09.413766 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 16 21:16:09.415558 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 16 21:16:09.415000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.415000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:09.415620 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 16 21:16:09.416000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.416115 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 16 21:16:09.416155 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 16 21:16:09.416561 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 16 21:16:09.416796 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 16 21:16:09.420730 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 16 21:16:09.421127 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 16 21:16:09.421000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.421178 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 16 21:16:09.422000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.422180 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 16 21:16:09.424000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.422227 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Jan 16 21:16:09.422879 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 16 21:16:09.425000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.422916 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 21:16:09.425410 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 16 21:16:09.425507 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 16 21:16:09.438956 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 16 21:16:09.439061 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 16 21:16:09.439000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.439000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:09.440186 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 16 21:16:09.441490 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 16 21:16:09.461529 systemd[1]: Switching root. Jan 16 21:16:09.500909 systemd-journald[341]: Journal stopped Jan 16 21:16:10.837479 systemd-journald[341]: Received SIGTERM from PID 1 (systemd). 
Jan 16 21:16:10.837557 kernel: SELinux: policy capability network_peer_controls=1 Jan 16 21:16:10.837572 kernel: SELinux: policy capability open_perms=1 Jan 16 21:16:10.837587 kernel: SELinux: policy capability extended_socket_class=1 Jan 16 21:16:10.837613 kernel: SELinux: policy capability always_check_network=0 Jan 16 21:16:10.837627 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 16 21:16:10.837638 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 16 21:16:10.837649 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 16 21:16:10.837663 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 16 21:16:10.837674 kernel: SELinux: policy capability userspace_initial_context=0 Jan 16 21:16:10.837688 systemd[1]: Successfully loaded SELinux policy in 78.615ms. Jan 16 21:16:10.837709 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.388ms. Jan 16 21:16:10.837722 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 16 21:16:10.837734 systemd[1]: Detected virtualization kvm. Jan 16 21:16:10.837749 systemd[1]: Detected architecture x86-64. Jan 16 21:16:10.837764 systemd[1]: Detected first boot. Jan 16 21:16:10.837776 systemd[1]: Hostname set to . Jan 16 21:16:10.837790 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 16 21:16:10.837806 zram_generator::config[1195]: No configuration found. 
Jan 16 21:16:10.837824 kernel: Guest personality initialized and is inactive Jan 16 21:16:10.837835 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 16 21:16:10.837845 kernel: Initialized host personality Jan 16 21:16:10.837856 kernel: NET: Registered PF_VSOCK protocol family Jan 16 21:16:10.837868 systemd[1]: Populated /etc with preset unit settings. Jan 16 21:16:10.837881 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 16 21:16:10.837893 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 16 21:16:10.837904 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 16 21:16:10.837919 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 16 21:16:10.837930 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 16 21:16:10.837942 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 16 21:16:10.837957 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 16 21:16:10.837972 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 16 21:16:10.837983 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 16 21:16:10.837994 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 16 21:16:10.838005 systemd[1]: Created slice user.slice - User and Session Slice. Jan 16 21:16:10.838016 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 16 21:16:10.838028 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 16 21:16:10.838039 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 16 21:16:10.838052 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
Jan 16 21:16:10.838065 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 16 21:16:10.838076 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 16 21:16:10.838087 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 16 21:16:10.838098 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 16 21:16:10.838111 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 16 21:16:10.838122 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 16 21:16:10.838133 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 16 21:16:10.838147 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 16 21:16:10.838164 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 16 21:16:10.838175 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 16 21:16:10.838186 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 16 21:16:10.838199 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 16 21:16:10.838211 systemd[1]: Reached target slices.target - Slice Units. Jan 16 21:16:10.838222 systemd[1]: Reached target swap.target - Swaps. Jan 16 21:16:10.838234 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 16 21:16:10.838245 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 16 21:16:10.838257 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 16 21:16:10.838268 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 16 21:16:10.838281 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. 
Jan 16 21:16:10.838292 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 16 21:16:10.838304 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 16 21:16:10.838315 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 16 21:16:10.838326 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 16 21:16:10.838337 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 16 21:16:10.838348 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 16 21:16:10.838360 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 16 21:16:10.838372 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 16 21:16:10.838382 systemd[1]: Mounting media.mount - External Media Directory... Jan 16 21:16:10.838394 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 16 21:16:10.838404 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 16 21:16:10.838415 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 16 21:16:10.838426 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 16 21:16:10.838440 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 16 21:16:10.838451 systemd[1]: Reached target machines.target - Containers. Jan 16 21:16:10.838462 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 16 21:16:10.838474 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 16 21:16:10.838484 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jan 16 21:16:10.838495 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 16 21:16:10.838507 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 16 21:16:10.838520 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 16 21:16:10.838531 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 16 21:16:10.838542 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 16 21:16:10.838555 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 16 21:16:10.838567 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 16 21:16:10.838578 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 16 21:16:10.838589 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 16 21:16:10.840617 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 16 21:16:10.840631 systemd[1]: Stopped systemd-fsck-usr.service. Jan 16 21:16:10.840644 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 16 21:16:10.840659 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 16 21:16:10.840670 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 16 21:16:10.840681 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 16 21:16:10.840693 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 16 21:16:10.840704 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Jan 16 21:16:10.840715 kernel: fuse: init (API version 7.41) Jan 16 21:16:10.840726 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 16 21:16:10.840739 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 16 21:16:10.840751 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 16 21:16:10.840762 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 16 21:16:10.840773 systemd[1]: Mounted media.mount - External Media Directory. Jan 16 21:16:10.840785 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 16 21:16:10.840796 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 16 21:16:10.840810 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 16 21:16:10.840821 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 16 21:16:10.840832 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 16 21:16:10.840844 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 16 21:16:10.840855 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 16 21:16:10.840868 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 16 21:16:10.840879 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 16 21:16:10.840891 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 16 21:16:10.840901 kernel: ACPI: bus type drm_connector registered Jan 16 21:16:10.840913 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 16 21:16:10.840924 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 16 21:16:10.840936 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 16 21:16:10.840948 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. 
Jan 16 21:16:10.840959 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 16 21:16:10.840971 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 16 21:16:10.840982 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 16 21:16:10.840993 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 16 21:16:10.841007 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 16 21:16:10.841026 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 16 21:16:10.841038 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 16 21:16:10.841052 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 16 21:16:10.841065 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 16 21:16:10.841078 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 16 21:16:10.841113 systemd-journald[1274]: Collecting audit messages is enabled. Jan 16 21:16:10.841138 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 16 21:16:10.841151 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 16 21:16:10.841164 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 16 21:16:10.841176 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 16 21:16:10.841189 systemd-journald[1274]: Journal started Jan 16 21:16:10.841212 systemd-journald[1274]: Runtime Journal (/run/log/journal/d25a18b3c144475eaaff4d6ad7796358) is 8M, max 77.9M, 69.9M free. 
Jan 16 21:16:10.586000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 16 21:16:10.709000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.713000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.717000 audit: BPF prog-id=14 op=UNLOAD Jan 16 21:16:10.717000 audit: BPF prog-id=13 op=UNLOAD Jan 16 21:16:10.717000 audit: BPF prog-id=15 op=LOAD Jan 16 21:16:10.717000 audit: BPF prog-id=16 op=LOAD Jan 16 21:16:10.718000 audit: BPF prog-id=17 op=LOAD Jan 16 21:16:10.779000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.784000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.784000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.790000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:10.790000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.797000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.797000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.801000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.801000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.805000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.805000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.809000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:10.809000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.812000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.816000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.818000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.833000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 16 21:16:10.833000 audit[1274]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffd9149f4a0 a2=4000 a3=0 items=0 ppid=1 pid=1274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:10.833000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 16 21:16:10.503758 systemd[1]: Queued start job for default target multi-user.target. Jan 16 21:16:10.528918 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 16 21:16:10.529395 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 16 21:16:10.847617 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Jan 16 21:16:10.851609 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 16 21:16:10.855610 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 16 21:16:10.860609 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 16 21:16:10.864612 systemd[1]: Started systemd-journald.service - Journal Service. Jan 16 21:16:10.864000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.866698 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 16 21:16:10.866000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.867682 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 16 21:16:10.868000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.872679 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 16 21:16:10.873000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.884135 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 16 21:16:10.889795 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
Jan 16 21:16:10.902657 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 16 21:16:10.907853 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 16 21:16:10.920502 systemd-journald[1274]: Time spent on flushing to /var/log/journal/d25a18b3c144475eaaff4d6ad7796358 is 62.470ms for 1840 entries. Jan 16 21:16:10.920502 systemd-journald[1274]: System Journal (/var/log/journal/d25a18b3c144475eaaff4d6ad7796358) is 8M, max 588.1M, 580.1M free. Jan 16 21:16:10.995333 systemd-journald[1274]: Received client request to flush runtime journal. Jan 16 21:16:10.995377 kernel: loop1: detected capacity change from 0 to 50784 Jan 16 21:16:10.942000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.971000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:10.942334 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 16 21:16:10.971559 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 16 21:16:10.996673 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 16 21:16:10.997000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.000623 kernel: loop2: detected capacity change from 0 to 111560 Jan 16 21:16:11.012521 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. 
Jan 16 21:16:11.012000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.021083 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 16 21:16:11.021000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.023000 audit: BPF prog-id=18 op=LOAD Jan 16 21:16:11.023000 audit: BPF prog-id=19 op=LOAD Jan 16 21:16:11.023000 audit: BPF prog-id=20 op=LOAD Jan 16 21:16:11.025842 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 16 21:16:11.026000 audit: BPF prog-id=21 op=LOAD Jan 16 21:16:11.029272 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 16 21:16:11.034821 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 16 21:16:11.041000 audit: BPF prog-id=22 op=LOAD Jan 16 21:16:11.041000 audit: BPF prog-id=23 op=LOAD Jan 16 21:16:11.041000 audit: BPF prog-id=24 op=LOAD Jan 16 21:16:11.042956 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 16 21:16:11.044000 audit: BPF prog-id=25 op=LOAD Jan 16 21:16:11.044000 audit: BPF prog-id=26 op=LOAD Jan 16 21:16:11.044000 audit: BPF prog-id=27 op=LOAD Jan 16 21:16:11.046812 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 16 21:16:11.058161 kernel: loop3: detected capacity change from 0 to 1656 Jan 16 21:16:11.088236 systemd-tmpfiles[1337]: ACLs are not supported, ignoring. Jan 16 21:16:11.088251 systemd-tmpfiles[1337]: ACLs are not supported, ignoring. 
Jan 16 21:16:11.089613 kernel: loop4: detected capacity change from 0 to 224512 Jan 16 21:16:11.096375 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 16 21:16:11.096000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.112148 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 16 21:16:11.112000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.113521 systemd-nsresourced[1339]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 16 21:16:11.114482 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 16 21:16:11.114000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.145656 kernel: loop5: detected capacity change from 0 to 50784 Jan 16 21:16:11.178625 kernel: loop6: detected capacity change from 0 to 111560 Jan 16 21:16:11.201460 systemd-oomd[1335]: No swap; memory pressure usage will be degraded Jan 16 21:16:11.203982 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 16 21:16:11.203000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:11.208626 kernel: loop7: detected capacity change from 0 to 1656 Jan 16 21:16:11.214503 systemd-resolved[1336]: Positive Trust Anchors: Jan 16 21:16:11.214516 systemd-resolved[1336]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 16 21:16:11.214520 systemd-resolved[1336]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 16 21:16:11.214550 systemd-resolved[1336]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 16 21:16:11.221536 kernel: loop1: detected capacity change from 0 to 224512 Jan 16 21:16:11.238125 systemd-resolved[1336]: Using system hostname 'ci-4580-0-0-p-be73a47b79'. Jan 16 21:16:11.239704 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 16 21:16:11.240000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.241219 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 16 21:16:11.249261 (sd-merge)[1359]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 16 21:16:11.255568 (sd-merge)[1359]: Merged extensions into '/usr'. Jan 16 21:16:11.261701 systemd[1]: Reload requested from client PID 1300 ('systemd-sysext') (unit systemd-sysext.service)... 
Jan 16 21:16:11.261716 systemd[1]: Reloading... Jan 16 21:16:11.308623 zram_generator::config[1387]: No configuration found. Jan 16 21:16:11.520656 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 16 21:16:11.520914 systemd[1]: Reloading finished in 258 ms. Jan 16 21:16:11.551583 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 16 21:16:11.551000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.553431 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 16 21:16:11.553000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:11.558377 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 16 21:16:11.559860 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 16 21:16:11.570793 systemd[1]: Starting ensure-sysext.service... Jan 16 21:16:11.575000 audit: BPF prog-id=8 op=UNLOAD Jan 16 21:16:11.575000 audit: BPF prog-id=7 op=UNLOAD Jan 16 21:16:11.575731 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 16 21:16:11.576000 audit: BPF prog-id=28 op=LOAD Jan 16 21:16:11.576000 audit: BPF prog-id=29 op=LOAD Jan 16 21:16:11.578717 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 16 21:16:11.579000 audit: BPF prog-id=30 op=LOAD Jan 16 21:16:11.580000 audit: BPF prog-id=25 op=UNLOAD Jan 16 21:16:11.580000 audit: BPF prog-id=31 op=LOAD Jan 16 21:16:11.580000 audit: BPF prog-id=32 op=LOAD Jan 16 21:16:11.580000 audit: BPF prog-id=26 op=UNLOAD Jan 16 21:16:11.580000 audit: BPF prog-id=27 op=UNLOAD Jan 16 21:16:11.580000 audit: BPF prog-id=33 op=LOAD Jan 16 21:16:11.580000 audit: BPF prog-id=18 op=UNLOAD Jan 16 21:16:11.582000 audit: BPF prog-id=34 op=LOAD Jan 16 21:16:11.582000 audit: BPF prog-id=35 op=LOAD Jan 16 21:16:11.582000 audit: BPF prog-id=19 op=UNLOAD Jan 16 21:16:11.582000 audit: BPF prog-id=20 op=UNLOAD Jan 16 21:16:11.582000 audit: BPF prog-id=36 op=LOAD Jan 16 21:16:11.582000 audit: BPF prog-id=21 op=UNLOAD Jan 16 21:16:11.583000 audit: BPF prog-id=37 op=LOAD Jan 16 21:16:11.583000 audit: BPF prog-id=15 op=UNLOAD Jan 16 21:16:11.583000 audit: BPF prog-id=38 op=LOAD Jan 16 21:16:11.583000 audit: BPF prog-id=39 op=LOAD Jan 16 21:16:11.583000 audit: BPF prog-id=16 op=UNLOAD Jan 16 21:16:11.583000 audit: BPF prog-id=17 op=UNLOAD Jan 16 21:16:11.583000 audit: BPF prog-id=40 op=LOAD Jan 16 21:16:11.583000 audit: BPF prog-id=22 op=UNLOAD Jan 16 21:16:11.583000 audit: BPF prog-id=41 op=LOAD Jan 16 21:16:11.583000 audit: BPF prog-id=42 op=LOAD Jan 16 21:16:11.583000 audit: BPF prog-id=23 op=UNLOAD Jan 16 21:16:11.583000 audit: BPF prog-id=24 op=UNLOAD Jan 16 21:16:11.586900 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 16 21:16:11.587489 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 16 21:16:11.593705 systemd[1]: Reload requested from client PID 1436 ('systemctl') (unit ensure-sysext.service)... Jan 16 21:16:11.593721 systemd[1]: Reloading... Jan 16 21:16:11.610652 systemd-udevd[1438]: Using default interface naming scheme 'v257'. 
Jan 16 21:16:11.614816 systemd-tmpfiles[1437]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 16 21:16:11.615111 systemd-tmpfiles[1437]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 16 21:16:11.615402 systemd-tmpfiles[1437]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 16 21:16:11.617327 systemd-tmpfiles[1437]: ACLs are not supported, ignoring. Jan 16 21:16:11.617709 systemd-tmpfiles[1437]: ACLs are not supported, ignoring. Jan 16 21:16:11.632939 systemd-tmpfiles[1437]: Detected autofs mount point /boot during canonicalization of boot. Jan 16 21:16:11.633083 systemd-tmpfiles[1437]: Skipping /boot Jan 16 21:16:11.642199 systemd-tmpfiles[1437]: Detected autofs mount point /boot during canonicalization of boot. Jan 16 21:16:11.642317 systemd-tmpfiles[1437]: Skipping /boot Jan 16 21:16:11.665633 zram_generator::config[1468]: No configuration found. Jan 16 21:16:11.796617 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 16 21:16:11.808618 kernel: mousedev: PS/2 mouse device common for all mice Jan 16 21:16:11.828635 kernel: ACPI: button: Power Button [PWRF] Jan 16 21:16:11.897425 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 16 21:16:11.898471 systemd[1]: Reloading finished in 304 ms. Jan 16 21:16:11.906475 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 16 21:16:11.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:11.908000 audit: BPF prog-id=43 op=LOAD Jan 16 21:16:11.908000 audit: BPF prog-id=40 op=UNLOAD Jan 16 21:16:11.908000 audit: BPF prog-id=44 op=LOAD Jan 16 21:16:11.908000 audit: BPF prog-id=45 op=LOAD Jan 16 21:16:11.908000 audit: BPF prog-id=41 op=UNLOAD Jan 16 21:16:11.908000 audit: BPF prog-id=42 op=UNLOAD Jan 16 21:16:11.908000 audit: BPF prog-id=46 op=LOAD Jan 16 21:16:11.908000 audit: BPF prog-id=47 op=LOAD Jan 16 21:16:11.908000 audit: BPF prog-id=28 op=UNLOAD Jan 16 21:16:11.908000 audit: BPF prog-id=29 op=UNLOAD Jan 16 21:16:11.910000 audit: BPF prog-id=48 op=LOAD Jan 16 21:16:11.910000 audit: BPF prog-id=30 op=UNLOAD Jan 16 21:16:11.910000 audit: BPF prog-id=49 op=LOAD Jan 16 21:16:11.910000 audit: BPF prog-id=50 op=LOAD Jan 16 21:16:11.910000 audit: BPF prog-id=31 op=UNLOAD Jan 16 21:16:11.910000 audit: BPF prog-id=32 op=UNLOAD Jan 16 21:16:11.910000 audit: BPF prog-id=51 op=LOAD Jan 16 21:16:11.910000 audit: BPF prog-id=33 op=UNLOAD Jan 16 21:16:11.910000 audit: BPF prog-id=52 op=LOAD Jan 16 21:16:11.910000 audit: BPF prog-id=53 op=LOAD Jan 16 21:16:11.910000 audit: BPF prog-id=34 op=UNLOAD Jan 16 21:16:11.911000 audit: BPF prog-id=35 op=UNLOAD Jan 16 21:16:11.911000 audit: BPF prog-id=54 op=LOAD Jan 16 21:16:11.911000 audit: BPF prog-id=37 op=UNLOAD Jan 16 21:16:11.911000 audit: BPF prog-id=55 op=LOAD Jan 16 21:16:11.911000 audit: BPF prog-id=56 op=LOAD Jan 16 21:16:11.911000 audit: BPF prog-id=38 op=UNLOAD Jan 16 21:16:11.911000 audit: BPF prog-id=39 op=UNLOAD Jan 16 21:16:11.913000 audit: BPF prog-id=57 op=LOAD Jan 16 21:16:11.913000 audit: BPF prog-id=36 op=UNLOAD Jan 16 21:16:11.948550 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 16 21:16:11.948852 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 16 21:16:11.950635 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 16 21:16:11.968617 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 16 21:16:11.971725 
kernel: Console: switching to colour dummy device 80x25 Jan 16 21:16:11.973626 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 16 21:16:11.973856 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 16 21:16:11.973879 kernel: [drm] features: -context_init Jan 16 21:16:11.979909 kernel: [drm] number of scanouts: 1 Jan 16 21:16:11.979970 kernel: [drm] number of cap sets: 0 Jan 16 21:16:11.980784 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 16 21:16:11.981086 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 16 21:16:11.980000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.021670 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 16 21:16:12.040324 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 16 21:16:12.084625 kernel: Console: switching to colour frame buffer device 160x50 Jan 16 21:16:12.089646 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 16 21:16:12.098970 systemd[1]: Finished ensure-sysext.service. Jan 16 21:16:12.098000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.110102 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 16 21:16:12.112842 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 16 21:16:12.115520 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
Jan 16 21:16:12.116548 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 16 21:16:12.119899 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 16 21:16:12.123878 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 16 21:16:12.127945 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 16 21:16:12.130325 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 16 21:16:12.135897 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 16 21:16:12.136117 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 16 21:16:12.136274 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 16 21:16:12.140662 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 16 21:16:12.145818 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 16 21:16:12.145912 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 16 21:16:12.150825 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 16 21:16:12.151000 audit: BPF prog-id=58 op=LOAD Jan 16 21:16:12.156147 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 16 21:16:12.157870 systemd[1]: Reached target time-set.target - System Time Set. Jan 16 21:16:12.161282 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
Jan 16 21:16:12.166720 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 21:16:12.167724 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 16 21:16:12.186018 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 16 21:16:12.186194 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 16 21:16:12.193000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.193000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.211000 audit[1572]: SYSTEM_BOOT pid=1572 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.218712 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 16 21:16:12.218000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.226479 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 16 21:16:12.227039 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 21:16:12.227000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:12.227000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.228871 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 16 21:16:12.229098 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 16 21:16:12.229000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.229000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.230947 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 16 21:16:12.231000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.232637 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 16 21:16:12.237862 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 16 21:16:12.237000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.237000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:12.238325 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 16 21:16:12.238000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.238000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.238489 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 16 21:16:12.243179 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 16 21:16:12.243308 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 16 21:16:12.245873 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 21:16:12.283616 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 16 21:16:12.283706 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 16 21:16:12.288640 kernel: PTP clock support registered Jan 16 21:16:12.297947 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 16 21:16:12.298685 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 16 21:16:12.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.301000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 16 21:16:12.326296 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 16 21:16:12.327000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:12.327000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 16 21:16:12.327000 audit[1613]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc0a4bd4f0 a2=420 a3=0 items=0 ppid=1561 pid=1613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:12.327000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 21:16:12.329225 augenrules[1613]: No rules Jan 16 21:16:12.331446 systemd[1]: audit-rules.service: Deactivated successfully. Jan 16 21:16:12.331783 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 16 21:16:12.335472 systemd-networkd[1571]: lo: Link UP Jan 16 21:16:12.335628 systemd-networkd[1571]: lo: Gained carrier Jan 16 21:16:12.337324 systemd-networkd[1571]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 21:16:12.337395 systemd-networkd[1571]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 16 21:16:12.337418 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 16 21:16:12.339173 systemd-networkd[1571]: eth0: Link UP Jan 16 21:16:12.339381 systemd[1]: Reached target network.target - Network. 
Jan 16 21:16:12.339451 systemd-networkd[1571]: eth0: Gained carrier Jan 16 21:16:12.339500 systemd-networkd[1571]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 21:16:12.342279 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 16 21:16:12.346730 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 16 21:16:12.351693 systemd-networkd[1571]: eth0: DHCPv4 address 10.0.3.156/25, gateway 10.0.3.129 acquired from 10.0.3.129 Jan 16 21:16:12.370738 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 16 21:16:12.389199 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 16 21:16:12.393884 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 16 21:16:12.406663 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 21:16:13.098989 ldconfig[1568]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 16 21:16:13.105442 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 16 21:16:13.107995 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 16 21:16:13.130985 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 16 21:16:13.133305 systemd[1]: Reached target sysinit.target - System Initialization. Jan 16 21:16:13.134316 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Jan 16 21:16:13.135672 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 16 21:16:13.136165 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 16 21:16:13.137080 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 16 21:16:13.137582 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 16 21:16:13.138013 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 16 21:16:13.138659 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 16 21:16:13.139201 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 16 21:16:13.148277 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 16 21:16:13.148329 systemd[1]: Reached target paths.target - Path Units. Jan 16 21:16:13.149064 systemd[1]: Reached target timers.target - Timer Units. Jan 16 21:16:13.151886 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 16 21:16:13.153575 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 16 21:16:13.155944 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 16 21:16:13.157784 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 16 21:16:13.158180 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 16 21:16:13.166016 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 16 21:16:13.167047 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 16 21:16:13.168547 systemd[1]: Listening on docker.socket - Docker Socket for the API. 
Jan 16 21:16:13.171056 systemd[1]: Reached target sockets.target - Socket Units. Jan 16 21:16:13.171440 systemd[1]: Reached target basic.target - Basic System. Jan 16 21:16:13.173897 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 16 21:16:13.173923 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 16 21:16:13.177729 systemd[1]: Starting chronyd.service - NTP client/server... Jan 16 21:16:13.180682 systemd[1]: Starting containerd.service - containerd container runtime... Jan 16 21:16:13.191112 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 16 21:16:13.194862 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 16 21:16:13.197799 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 16 21:16:13.206587 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 16 21:16:13.211090 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 16 21:16:13.212372 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 16 21:16:13.214875 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 16 21:16:13.215661 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:13.221526 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 16 21:16:13.232772 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 16 21:16:13.237378 jq[1635]: false Jan 16 21:16:13.237640 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Jan 16 21:16:13.244130 google_oslogin_nss_cache[1638]: oslogin_cache_refresh[1638]: Refreshing passwd entry cache Jan 16 21:16:13.244136 oslogin_cache_refresh[1638]: Refreshing passwd entry cache Jan 16 21:16:13.246795 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 16 21:16:13.250055 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 16 21:16:13.253516 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 16 21:16:13.254010 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 16 21:16:13.257722 systemd[1]: Starting update-engine.service - Update Engine... Jan 16 21:16:13.259723 extend-filesystems[1636]: Found /dev/vda6 Jan 16 21:16:13.269615 google_oslogin_nss_cache[1638]: oslogin_cache_refresh[1638]: Failure getting users, quitting Jan 16 21:16:13.269615 google_oslogin_nss_cache[1638]: oslogin_cache_refresh[1638]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 16 21:16:13.269615 google_oslogin_nss_cache[1638]: oslogin_cache_refresh[1638]: Refreshing group entry cache Jan 16 21:16:13.266103 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 16 21:16:13.263564 oslogin_cache_refresh[1638]: Failure getting users, quitting Jan 16 21:16:13.263583 oslogin_cache_refresh[1638]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 16 21:16:13.263663 oslogin_cache_refresh[1638]: Refreshing group entry cache Jan 16 21:16:13.278331 google_oslogin_nss_cache[1638]: oslogin_cache_refresh[1638]: Failure getting groups, quitting Jan 16 21:16:13.278331 google_oslogin_nss_cache[1638]: oslogin_cache_refresh[1638]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Jan 16 21:16:13.274797 oslogin_cache_refresh[1638]: Failure getting groups, quitting Jan 16 21:16:13.274808 oslogin_cache_refresh[1638]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 16 21:16:13.278570 jq[1651]: true Jan 16 21:16:13.279413 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 16 21:16:13.281238 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 16 21:16:13.281432 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 16 21:16:13.281856 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 16 21:16:13.282724 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 16 21:16:13.284243 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 16 21:16:13.284423 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 16 21:16:13.296756 extend-filesystems[1636]: Found /dev/vda9 Jan 16 21:16:13.300329 extend-filesystems[1636]: Checking size of /dev/vda9 Jan 16 21:16:13.306507 systemd[1]: motdgen.service: Deactivated successfully. Jan 16 21:16:13.308743 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 16 21:16:13.325557 extend-filesystems[1636]: Resized partition /dev/vda9 Jan 16 21:16:13.324412 systemd[1]: Started chronyd.service - NTP client/server. 
Jan 16 21:16:13.323530 chronyd[1630]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 16 21:16:13.330159 extend-filesystems[1684]: resize2fs 1.47.3 (8-Jul-2025) Jan 16 21:16:13.334636 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 16 21:16:13.334676 jq[1662]: true Jan 16 21:16:13.324238 chronyd[1630]: Loaded seccomp filter (level 2) Jan 16 21:16:13.334913 update_engine[1649]: I20260116 21:16:13.333318 1649 main.cc:92] Flatcar Update Engine starting Jan 16 21:16:13.346984 tar[1660]: linux-amd64/LICENSE Jan 16 21:16:13.373210 dbus-daemon[1633]: [system] SELinux support is enabled Jan 16 21:16:13.373441 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 16 21:16:13.396755 tar[1660]: linux-amd64/helm Jan 16 21:16:13.379081 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 16 21:16:13.379105 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 16 21:16:13.382904 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 16 21:16:13.382925 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 16 21:16:13.402980 systemd[1]: Started update-engine.service - Update Engine. Jan 16 21:16:13.458180 update_engine[1649]: I20260116 21:16:13.402989 1649 update_check_scheduler.cc:74] Next update check in 4m41s Jan 16 21:16:13.407249 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 16 21:16:13.448590 systemd-logind[1647]: New seat seat0. 
Jan 16 21:16:13.457067 systemd-logind[1647]: Watching system buttons on /dev/input/event3 (Power Button) Jan 16 21:16:13.457083 systemd-logind[1647]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 16 21:16:13.461977 systemd[1]: Started systemd-logind.service - User Login Management. Jan 16 21:16:13.519752 locksmithd[1704]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 16 21:16:13.607141 sshd_keygen[1681]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 16 21:16:13.629901 bash[1701]: Updated "/home/core/.ssh/authorized_keys" Jan 16 21:16:13.631166 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 16 21:16:13.636411 containerd[1678]: time="2026-01-16T21:16:13Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 16 21:16:13.638770 systemd[1]: Starting sshkeys.service... Jan 16 21:16:13.643027 containerd[1678]: time="2026-01-16T21:16:13.641215185Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 16 21:16:13.641280 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 16 21:16:13.650524 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 16 21:16:13.660446 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 16 21:16:13.663144 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 16 21:16:13.678277 containerd[1678]: time="2026-01-16T21:16:13.678239328Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.782µs" Jan 16 21:16:13.678622 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:13.678735 containerd[1678]: time="2026-01-16T21:16:13.678716245Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 16 21:16:13.678913 containerd[1678]: time="2026-01-16T21:16:13.678900159Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 16 21:16:13.679230 containerd[1678]: time="2026-01-16T21:16:13.679215138Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 16 21:16:13.679864 containerd[1678]: time="2026-01-16T21:16:13.679416453Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 16 21:16:13.679864 containerd[1678]: time="2026-01-16T21:16:13.679434378Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 16 21:16:13.679864 containerd[1678]: time="2026-01-16T21:16:13.679493106Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 16 21:16:13.679864 containerd[1678]: time="2026-01-16T21:16:13.679503667Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 16 21:16:13.680873 containerd[1678]: time="2026-01-16T21:16:13.680852434Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 16 21:16:13.680935 containerd[1678]: 
time="2026-01-16T21:16:13.680925679Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 16 21:16:13.681041 containerd[1678]: time="2026-01-16T21:16:13.681029651Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 16 21:16:13.681106 containerd[1678]: time="2026-01-16T21:16:13.681097370Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 16 21:16:13.681729 containerd[1678]: time="2026-01-16T21:16:13.681712428Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 16 21:16:13.682876 containerd[1678]: time="2026-01-16T21:16:13.681841658Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 16 21:16:13.682876 containerd[1678]: time="2026-01-16T21:16:13.681916287Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 16 21:16:13.683296 containerd[1678]: time="2026-01-16T21:16:13.683280228Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 16 21:16:13.683360 containerd[1678]: time="2026-01-16T21:16:13.683349264Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 16 21:16:13.683888 containerd[1678]: time="2026-01-16T21:16:13.683872370Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 16 21:16:13.684069 systemd[1]: issuegen.service: Deactivated successfully. 
Jan 16 21:16:13.684700 containerd[1678]: time="2026-01-16T21:16:13.684457436Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 16 21:16:13.684515 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 16 21:16:13.687072 containerd[1678]: time="2026-01-16T21:16:13.686815926Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 16 21:16:13.687930 containerd[1678]: time="2026-01-16T21:16:13.687770912Z" level=info msg="metadata content store policy set" policy=shared Jan 16 21:16:13.688485 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 16 21:16:13.713802 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 16 21:16:13.718426 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 16 21:16:13.721921 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 16 21:16:13.722616 systemd[1]: Reached target getty.target - Login Prompts. 
Jan 16 21:16:13.752629 containerd[1678]: time="2026-01-16T21:16:13.752573192Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 16 21:16:13.752841 containerd[1678]: time="2026-01-16T21:16:13.752752368Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753209981Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753229697Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753243386Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753254181Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753264831Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753273065Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753283286Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753294044Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753304035Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753314505Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753327794Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753339114Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 16 21:16:13.754143 containerd[1678]: time="2026-01-16T21:16:13.753457219Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753473734Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753497387Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753507889Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753517352Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753526123Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753537018Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753547303Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases 
type=io.containerd.grpc.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753557249Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753566622Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753575895Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753606421Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753644739Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753656378Z" level=info msg="Start snapshots syncer" Jan 16 21:16:13.754395 containerd[1678]: time="2026-01-16T21:16:13.753680438Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 16 21:16:13.754642 containerd[1678]: time="2026-01-16T21:16:13.753929814Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 16 21:16:13.754642 containerd[1678]: time="2026-01-16T21:16:13.753972713Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 16 21:16:13.754756 containerd[1678]: 
time="2026-01-16T21:16:13.754013422Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 16 21:16:13.754756 containerd[1678]: time="2026-01-16T21:16:13.754095355Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 16 21:16:13.755008 containerd[1678]: time="2026-01-16T21:16:13.754971260Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 16 21:16:13.755061 containerd[1678]: time="2026-01-16T21:16:13.755046724Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 16 21:16:13.755103 containerd[1678]: time="2026-01-16T21:16:13.755095540Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 16 21:16:13.755156 containerd[1678]: time="2026-01-16T21:16:13.755145641Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 16 21:16:13.755739 containerd[1678]: time="2026-01-16T21:16:13.755713755Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 16 21:16:13.755868 containerd[1678]: time="2026-01-16T21:16:13.755749415Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 16 21:16:13.755868 containerd[1678]: time="2026-01-16T21:16:13.755765438Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 16 21:16:13.755868 containerd[1678]: time="2026-01-16T21:16:13.755777273Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 16 21:16:13.755868 containerd[1678]: time="2026-01-16T21:16:13.755841644Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 16 21:16:13.755868 containerd[1678]: 
time="2026-01-16T21:16:13.755859673Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 16 21:16:13.755951 containerd[1678]: time="2026-01-16T21:16:13.755868738Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 16 21:16:13.755951 containerd[1678]: time="2026-01-16T21:16:13.755881659Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 16 21:16:13.755951 containerd[1678]: time="2026-01-16T21:16:13.755896797Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 16 21:16:13.755951 containerd[1678]: time="2026-01-16T21:16:13.755908851Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 16 21:16:13.755951 containerd[1678]: time="2026-01-16T21:16:13.755918714Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 16 21:16:13.755951 containerd[1678]: time="2026-01-16T21:16:13.755939817Z" level=info msg="runtime interface created" Jan 16 21:16:13.755951 containerd[1678]: time="2026-01-16T21:16:13.755945181Z" level=info msg="created NRI interface" Jan 16 21:16:13.756067 containerd[1678]: time="2026-01-16T21:16:13.755956172Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 16 21:16:13.756067 containerd[1678]: time="2026-01-16T21:16:13.755972626Z" level=info msg="Connect containerd service" Jan 16 21:16:13.756067 containerd[1678]: time="2026-01-16T21:16:13.756000598Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 16 21:16:13.757113 containerd[1678]: time="2026-01-16T21:16:13.757090966Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 16 21:16:13.873578 containerd[1678]: time="2026-01-16T21:16:13.873356843Z" level=info msg="Start subscribing containerd event" Jan 16 21:16:13.873578 containerd[1678]: time="2026-01-16T21:16:13.873403156Z" level=info msg="Start recovering state" Jan 16 21:16:13.873578 containerd[1678]: time="2026-01-16T21:16:13.873517882Z" level=info msg="Start event monitor" Jan 16 21:16:13.873578 containerd[1678]: time="2026-01-16T21:16:13.873529968Z" level=info msg="Start cni network conf syncer for default" Jan 16 21:16:13.873578 containerd[1678]: time="2026-01-16T21:16:13.873540180Z" level=info msg="Start streaming server" Jan 16 21:16:13.873578 containerd[1678]: time="2026-01-16T21:16:13.873547279Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 16 21:16:13.873578 containerd[1678]: time="2026-01-16T21:16:13.873553683Z" level=info msg="runtime interface starting up..." Jan 16 21:16:13.873578 containerd[1678]: time="2026-01-16T21:16:13.873560017Z" level=info msg="starting plugins..." Jan 16 21:16:13.874697 containerd[1678]: time="2026-01-16T21:16:13.873710270Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 16 21:16:13.875216 containerd[1678]: time="2026-01-16T21:16:13.875196036Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 16 21:16:13.875326 containerd[1678]: time="2026-01-16T21:16:13.875316477Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 16 21:16:13.876383 containerd[1678]: time="2026-01-16T21:16:13.876367744Z" level=info msg="containerd successfully booted in 0.239329s" Jan 16 21:16:13.877258 systemd[1]: Started containerd.service - containerd container runtime. 
Jan 16 21:16:13.946630 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 16 21:16:13.976115 extend-filesystems[1684]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 16 21:16:13.976115 extend-filesystems[1684]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 16 21:16:13.976115 extend-filesystems[1684]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 16 21:16:13.981526 extend-filesystems[1636]: Resized filesystem in /dev/vda9 Jan 16 21:16:13.977449 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 16 21:16:13.977732 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 16 21:16:14.060855 tar[1660]: linux-amd64/README.md Jan 16 21:16:14.077453 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 16 21:16:14.246662 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:14.310426 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 16 21:16:14.312936 systemd[1]: Started sshd@0-10.0.3.156:22-4.153.228.146:55884.service - OpenSSH per-connection server daemon (4.153.228.146:55884). Jan 16 21:16:14.331801 systemd-networkd[1571]: eth0: Gained IPv6LL Jan 16 21:16:14.337065 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 16 21:16:14.339277 systemd[1]: Reached target network-online.target - Network is Online. Jan 16 21:16:14.342879 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 21:16:14.345089 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 16 21:16:14.389224 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Jan 16 21:16:14.697795 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:14.904638 sshd[1757]: Accepted publickey for core from 4.153.228.146 port 55884 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:14.909142 sshd-session[1757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:14.924248 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 16 21:16:14.928890 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 16 21:16:14.933895 systemd-logind[1647]: New session 1 of user core. Jan 16 21:16:14.966013 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 16 21:16:14.970573 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 16 21:16:14.990071 (systemd)[1776]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:14.993072 systemd-logind[1647]: New session 2 of user core. Jan 16 21:16:15.125018 systemd[1776]: Queued start job for default target default.target. Jan 16 21:16:15.129542 systemd[1776]: Created slice app.slice - User Application Slice. Jan 16 21:16:15.129571 systemd[1776]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 16 21:16:15.129584 systemd[1776]: Reached target paths.target - Paths. Jan 16 21:16:15.129712 systemd[1776]: Reached target timers.target - Timers. Jan 16 21:16:15.131798 systemd[1776]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 16 21:16:15.135404 systemd[1776]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 16 21:16:15.153107 systemd[1776]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 16 21:16:15.153367 systemd[1776]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 16 21:16:15.153473 systemd[1776]: Reached target sockets.target - Sockets. 
Jan 16 21:16:15.153511 systemd[1776]: Reached target basic.target - Basic System. Jan 16 21:16:15.153542 systemd[1776]: Reached target default.target - Main User Target. Jan 16 21:16:15.153568 systemd[1776]: Startup finished in 150ms. Jan 16 21:16:15.153994 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 16 21:16:15.163046 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 16 21:16:15.482328 systemd[1]: Started sshd@1-10.0.3.156:22-4.153.228.146:47698.service - OpenSSH per-connection server daemon (4.153.228.146:47698). Jan 16 21:16:15.637064 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 21:16:15.645046 (kubelet)[1798]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 21:16:16.037708 sshd[1790]: Accepted publickey for core from 4.153.228.146 port 47698 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:16.038294 sshd-session[1790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:16.045810 systemd-logind[1647]: New session 3 of user core. Jan 16 21:16:16.049936 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 16 21:16:16.258621 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:16.341700 sshd[1804]: Connection closed by 4.153.228.146 port 47698 Jan 16 21:16:16.342168 sshd-session[1790]: pam_unix(sshd:session): session closed for user core Jan 16 21:16:16.347328 systemd-logind[1647]: Session 3 logged out. Waiting for processes to exit. Jan 16 21:16:16.347364 systemd[1]: sshd@1-10.0.3.156:22-4.153.228.146:47698.service: Deactivated successfully. Jan 16 21:16:16.352524 systemd[1]: session-3.scope: Deactivated successfully. Jan 16 21:16:16.362471 systemd-logind[1647]: Removed session 3. 
Jan 16 21:16:16.454808 systemd[1]: Started sshd@2-10.0.3.156:22-4.153.228.146:47702.service - OpenSSH per-connection server daemon (4.153.228.146:47702). Jan 16 21:16:16.534475 kubelet[1798]: E0116 21:16:16.534409 1798 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 21:16:16.536902 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 21:16:16.537053 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 21:16:16.537479 systemd[1]: kubelet.service: Consumed 997ms CPU time, 263.8M memory peak. Jan 16 21:16:16.707606 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:17.000363 sshd[1811]: Accepted publickey for core from 4.153.228.146 port 47702 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:17.001287 sshd-session[1811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:17.006971 systemd-logind[1647]: New session 4 of user core. Jan 16 21:16:17.012840 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 16 21:16:17.314031 sshd[1818]: Connection closed by 4.153.228.146 port 47702 Jan 16 21:16:17.314635 sshd-session[1811]: pam_unix(sshd:session): session closed for user core Jan 16 21:16:17.318402 systemd[1]: sshd@2-10.0.3.156:22-4.153.228.146:47702.service: Deactivated successfully. Jan 16 21:16:17.319868 systemd[1]: session-4.scope: Deactivated successfully. Jan 16 21:16:17.320510 systemd-logind[1647]: Session 4 logged out. Waiting for processes to exit. Jan 16 21:16:17.321541 systemd-logind[1647]: Removed session 4. 
Jan 16 21:16:20.269635 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:20.278866 coreos-metadata[1632]: Jan 16 21:16:20.278 WARN failed to locate config-drive, using the metadata service API instead Jan 16 21:16:20.294992 coreos-metadata[1632]: Jan 16 21:16:20.294 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 16 21:16:20.621216 coreos-metadata[1632]: Jan 16 21:16:20.621 INFO Fetch successful Jan 16 21:16:20.621216 coreos-metadata[1632]: Jan 16 21:16:20.621 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 16 21:16:20.724951 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 21:16:20.734878 coreos-metadata[1728]: Jan 16 21:16:20.734 WARN failed to locate config-drive, using the metadata service API instead Jan 16 21:16:20.742389 coreos-metadata[1632]: Jan 16 21:16:20.742 INFO Fetch successful Jan 16 21:16:20.742389 coreos-metadata[1632]: Jan 16 21:16:20.742 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 16 21:16:20.747474 coreos-metadata[1728]: Jan 16 21:16:20.747 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 16 21:16:21.009804 coreos-metadata[1632]: Jan 16 21:16:21.009 INFO Fetch successful Jan 16 21:16:21.009804 coreos-metadata[1632]: Jan 16 21:16:21.009 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 16 21:16:21.012204 coreos-metadata[1728]: Jan 16 21:16:21.012 INFO Fetch successful Jan 16 21:16:21.012204 coreos-metadata[1728]: Jan 16 21:16:21.012 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 16 21:16:21.238891 coreos-metadata[1632]: Jan 16 21:16:21.238 INFO Fetch successful Jan 16 21:16:21.238979 coreos-metadata[1632]: Jan 16 21:16:21.238 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 16 21:16:21.241269 coreos-metadata[1728]: Jan 16 21:16:21.241 INFO Fetch 
successful Jan 16 21:16:21.243727 unknown[1728]: wrote ssh authorized keys file for user: core Jan 16 21:16:21.272903 update-ssh-keys[1832]: Updated "/home/core/.ssh/authorized_keys" Jan 16 21:16:21.274080 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 16 21:16:21.277511 systemd[1]: Finished sshkeys.service. Jan 16 21:16:21.526484 coreos-metadata[1632]: Jan 16 21:16:21.526 INFO Fetch successful Jan 16 21:16:21.526484 coreos-metadata[1632]: Jan 16 21:16:21.526 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 16 21:16:21.678247 coreos-metadata[1632]: Jan 16 21:16:21.678 INFO Fetch successful Jan 16 21:16:21.706279 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 16 21:16:21.706881 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 16 21:16:21.707135 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 16 21:16:21.709669 systemd[1]: Startup finished in 3.634s (kernel) + 13.038s (initrd) + 12.044s (userspace) = 28.718s. Jan 16 21:16:26.642106 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 16 21:16:26.643860 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 21:16:26.792013 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 16 21:16:26.796203 (kubelet)[1848]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 21:16:26.834649 kubelet[1848]: E0116 21:16:26.834574 1848 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 21:16:26.838083 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 21:16:26.838216 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 21:16:26.838538 systemd[1]: kubelet.service: Consumed 151ms CPU time, 110.2M memory peak. Jan 16 21:16:27.427772 systemd[1]: Started sshd@3-10.0.3.156:22-4.153.228.146:36026.service - OpenSSH per-connection server daemon (4.153.228.146:36026). Jan 16 21:16:27.977635 sshd[1856]: Accepted publickey for core from 4.153.228.146 port 36026 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:27.978509 sshd-session[1856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:27.982682 systemd-logind[1647]: New session 5 of user core. Jan 16 21:16:27.994167 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 16 21:16:28.280986 sshd[1860]: Connection closed by 4.153.228.146 port 36026 Jan 16 21:16:28.280770 sshd-session[1856]: pam_unix(sshd:session): session closed for user core Jan 16 21:16:28.286042 systemd[1]: sshd@3-10.0.3.156:22-4.153.228.146:36026.service: Deactivated successfully. Jan 16 21:16:28.288132 systemd[1]: session-5.scope: Deactivated successfully. Jan 16 21:16:28.289746 systemd-logind[1647]: Session 5 logged out. Waiting for processes to exit. Jan 16 21:16:28.290461 systemd-logind[1647]: Removed session 5. 
Jan 16 21:16:28.394619 systemd[1]: Started sshd@4-10.0.3.156:22-4.153.228.146:36042.service - OpenSSH per-connection server daemon (4.153.228.146:36042). Jan 16 21:16:28.939437 sshd[1866]: Accepted publickey for core from 4.153.228.146 port 36042 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:28.940722 sshd-session[1866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:28.947857 systemd-logind[1647]: New session 6 of user core. Jan 16 21:16:28.960881 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 16 21:16:29.239420 sshd[1870]: Connection closed by 4.153.228.146 port 36042 Jan 16 21:16:29.240020 sshd-session[1866]: pam_unix(sshd:session): session closed for user core Jan 16 21:16:29.244807 systemd[1]: sshd@4-10.0.3.156:22-4.153.228.146:36042.service: Deactivated successfully. Jan 16 21:16:29.246761 systemd[1]: session-6.scope: Deactivated successfully. Jan 16 21:16:29.248846 systemd-logind[1647]: Session 6 logged out. Waiting for processes to exit. Jan 16 21:16:29.249954 systemd-logind[1647]: Removed session 6. Jan 16 21:16:29.356962 systemd[1]: Started sshd@5-10.0.3.156:22-4.153.228.146:36058.service - OpenSSH per-connection server daemon (4.153.228.146:36058). Jan 16 21:16:29.907615 sshd[1876]: Accepted publickey for core from 4.153.228.146 port 36058 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:29.908805 sshd-session[1876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:29.914477 systemd-logind[1647]: New session 7 of user core. Jan 16 21:16:29.920912 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 16 21:16:30.210847 sshd[1880]: Connection closed by 4.153.228.146 port 36058 Jan 16 21:16:30.212518 sshd-session[1876]: pam_unix(sshd:session): session closed for user core Jan 16 21:16:30.221436 systemd-logind[1647]: Session 7 logged out. Waiting for processes to exit. 
Jan 16 21:16:30.222096 systemd[1]: sshd@5-10.0.3.156:22-4.153.228.146:36058.service: Deactivated successfully. Jan 16 21:16:30.227059 systemd[1]: session-7.scope: Deactivated successfully. Jan 16 21:16:30.231675 systemd-logind[1647]: Removed session 7. Jan 16 21:16:30.319715 systemd[1]: Started sshd@6-10.0.3.156:22-4.153.228.146:36060.service - OpenSSH per-connection server daemon (4.153.228.146:36060). Jan 16 21:16:30.859640 sshd[1886]: Accepted publickey for core from 4.153.228.146 port 36060 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:30.860577 sshd-session[1886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:30.864540 systemd-logind[1647]: New session 8 of user core. Jan 16 21:16:30.878831 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 16 21:16:31.087357 sudo[1891]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 16 21:16:31.087667 sudo[1891]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 21:16:31.096875 sudo[1891]: pam_unix(sudo:session): session closed for user root Jan 16 21:16:31.196662 sshd[1890]: Connection closed by 4.153.228.146 port 36060 Jan 16 21:16:31.197452 sshd-session[1886]: pam_unix(sshd:session): session closed for user core Jan 16 21:16:31.202592 systemd[1]: sshd@6-10.0.3.156:22-4.153.228.146:36060.service: Deactivated successfully. Jan 16 21:16:31.204628 systemd[1]: session-8.scope: Deactivated successfully. Jan 16 21:16:31.206056 systemd-logind[1647]: Session 8 logged out. Waiting for processes to exit. Jan 16 21:16:31.206956 systemd-logind[1647]: Removed session 8. Jan 16 21:16:31.312899 systemd[1]: Started sshd@7-10.0.3.156:22-4.153.228.146:36074.service - OpenSSH per-connection server daemon (4.153.228.146:36074). 
Jan 16 21:16:31.866642 sshd[1898]: Accepted publickey for core from 4.153.228.146 port 36074 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:31.867820 sshd-session[1898]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:31.872408 systemd-logind[1647]: New session 9 of user core. Jan 16 21:16:31.878856 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 16 21:16:32.072374 sudo[1904]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 16 21:16:32.072766 sudo[1904]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 21:16:32.081975 sudo[1904]: pam_unix(sudo:session): session closed for user root Jan 16 21:16:32.088018 sudo[1903]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 16 21:16:32.088280 sudo[1903]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 21:16:32.096344 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 16 21:16:32.139000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 16 21:16:32.140870 augenrules[1928]: No rules Jan 16 21:16:32.141187 kernel: kauditd_printk_skb: 188 callbacks suppressed Jan 16 21:16:32.141246 kernel: audit: type=1305 audit(1768598192.139:233): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 16 21:16:32.139000 audit[1928]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc78a824c0 a2=420 a3=0 items=0 ppid=1909 pid=1928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:32.143139 systemd[1]: audit-rules.service: Deactivated successfully. 
Jan 16 21:16:32.143933 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 16 21:16:32.145627 kernel: audit: type=1300 audit(1768598192.139:233): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc78a824c0 a2=420 a3=0 items=0 ppid=1909 pid=1928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:32.139000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 21:16:32.146247 sudo[1903]: pam_unix(sudo:session): session closed for user root Jan 16 21:16:32.142000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:32.150625 kernel: audit: type=1327 audit(1768598192.139:233): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 21:16:32.150706 kernel: audit: type=1130 audit(1768598192.142:234): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:32.150737 kernel: audit: type=1131 audit(1768598192.142:235): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:32.142000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:32.145000 audit[1903]: USER_END pid=1903 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:16:32.155113 kernel: audit: type=1106 audit(1768598192.145:236): pid=1903 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:16:32.145000 audit[1903]: CRED_DISP pid=1903 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:16:32.157207 kernel: audit: type=1104 audit(1768598192.145:237): pid=1903 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:32.245736 sshd[1902]: Connection closed by 4.153.228.146 port 36074 Jan 16 21:16:32.246301 sshd-session[1898]: pam_unix(sshd:session): session closed for user core Jan 16 21:16:32.246000 audit[1898]: USER_END pid=1898 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:16:32.251610 kernel: audit: type=1106 audit(1768598192.246:238): pid=1898 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:16:32.252452 systemd[1]: sshd@7-10.0.3.156:22-4.153.228.146:36074.service: Deactivated successfully. Jan 16 21:16:32.246000 audit[1898]: CRED_DISP pid=1898 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:16:32.254477 systemd[1]: session-9.scope: Deactivated successfully. Jan 16 21:16:32.252000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.3.156:22-4.153.228.146:36074 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:32.257215 kernel: audit: type=1104 audit(1768598192.246:239): pid=1898 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:16:32.257268 kernel: audit: type=1131 audit(1768598192.252:240): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.3.156:22-4.153.228.146:36074 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:32.257562 systemd-logind[1647]: Session 9 logged out. Waiting for processes to exit. Jan 16 21:16:32.258951 systemd-logind[1647]: Removed session 9. Jan 16 21:16:32.356230 systemd[1]: Started sshd@8-10.0.3.156:22-4.153.228.146:36076.service - OpenSSH per-connection server daemon (4.153.228.146:36076). Jan 16 21:16:32.355000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.156:22-4.153.228.146:36076 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:32.887000 audit[1937]: USER_ACCT pid=1937 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:16:32.888489 sshd[1937]: Accepted publickey for core from 4.153.228.146 port 36076 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:16:32.889000 audit[1937]: CRED_ACQ pid=1937 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:16:32.889000 audit[1937]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff0d378010 a2=3 a3=0 items=0 ppid=1 pid=1937 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:32.889000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:16:32.890207 sshd-session[1937]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:16:32.895296 systemd-logind[1647]: New session 10 of user core. Jan 16 21:16:32.900852 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 16 21:16:32.902000 audit[1937]: USER_START pid=1937 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:16:32.904000 audit[1941]: CRED_ACQ pid=1941 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:16:33.089201 sudo[1942]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 16 21:16:33.088000 audit[1942]: USER_ACCT pid=1942 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:16:33.088000 audit[1942]: CRED_REFR pid=1942 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:16:33.089000 audit[1942]: USER_START pid=1942 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:16:33.089797 sudo[1942]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 21:16:33.533440 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 16 21:16:33.550002 (dockerd)[1962]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 16 21:16:33.895428 dockerd[1962]: time="2026-01-16T21:16:33.895385737Z" level=info msg="Starting up" Jan 16 21:16:33.897022 dockerd[1962]: time="2026-01-16T21:16:33.897002151Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 16 21:16:33.908605 dockerd[1962]: time="2026-01-16T21:16:33.908560893Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 16 21:16:33.930550 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport417988696-merged.mount: Deactivated successfully. Jan 16 21:16:33.964275 dockerd[1962]: time="2026-01-16T21:16:33.964100549Z" level=info msg="Loading containers: start." Jan 16 21:16:33.974631 kernel: Initializing XFRM netlink socket Jan 16 21:16:34.042000 audit[2010]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2010 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.042000 audit[2010]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffc9ec03ce0 a2=0 a3=0 items=0 ppid=1962 pid=2010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.042000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 16 21:16:34.044000 audit[2012]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2012 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.044000 audit[2012]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffffc7f5fb0 a2=0 a3=0 items=0 ppid=1962 pid=2012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.044000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 16 21:16:34.045000 audit[2014]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2014 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.045000 audit[2014]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc6c6ea580 a2=0 a3=0 items=0 ppid=1962 pid=2014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.045000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 16 21:16:34.047000 audit[2016]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2016 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.047000 audit[2016]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdcc9992c0 a2=0 a3=0 items=0 ppid=1962 pid=2016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.047000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 16 21:16:34.049000 audit[2018]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2018 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.049000 audit[2018]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe4c1a2dc0 a2=0 a3=0 items=0 ppid=1962 pid=2018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.049000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 16 21:16:34.051000 audit[2020]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2020 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.051000 audit[2020]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffc62637990 a2=0 a3=0 items=0 ppid=1962 pid=2020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.051000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 21:16:34.053000 audit[2022]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2022 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.053000 audit[2022]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffcb3bc5610 a2=0 a3=0 items=0 ppid=1962 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.053000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 21:16:34.055000 audit[2024]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2024 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.055000 audit[2024]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffeaeb4d0a0 a2=0 a3=0 items=0 ppid=1962 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.055000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 16 21:16:34.089000 audit[2027]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.089000 audit[2027]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffc38dc2850 a2=0 a3=0 items=0 ppid=1962 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.089000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 16 21:16:34.091000 audit[2029]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.091000 audit[2029]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc15a75530 a2=0 a3=0 items=0 ppid=1962 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.091000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 16 21:16:34.093000 audit[2031]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.093000 audit[2031]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=236 a0=3 a1=7ffdab6b32d0 a2=0 a3=0 items=0 ppid=1962 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.093000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 16 21:16:34.096000 audit[2033]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.096000 audit[2033]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffff4138f90 a2=0 a3=0 items=0 ppid=1962 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.096000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 21:16:34.098000 audit[2035]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.098000 audit[2035]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffc9afcccc0 a2=0 a3=0 items=0 ppid=1962 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.098000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 16 21:16:34.136000 audit[2065]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2065 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.136000 audit[2065]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffdd1f514f0 a2=0 a3=0 items=0 ppid=1962 pid=2065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.136000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 16 21:16:34.138000 audit[2067]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2067 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.138000 audit[2067]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffd2b6c39b0 a2=0 a3=0 items=0 ppid=1962 pid=2067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.138000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 16 21:16:34.139000 audit[2069]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2069 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.139000 audit[2069]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff9bf6ffe0 a2=0 a3=0 items=0 ppid=1962 pid=2069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.139000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 16 21:16:34.141000 audit[2071]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2071 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.141000 audit[2071]: SYSCALL arch=c000003e syscall=46 
success=yes exit=100 a0=3 a1=7ffead48da30 a2=0 a3=0 items=0 ppid=1962 pid=2071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.141000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 16 21:16:34.142000 audit[2073]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2073 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.142000 audit[2073]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffd7b0cad0 a2=0 a3=0 items=0 ppid=1962 pid=2073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.142000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 16 21:16:34.144000 audit[2075]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2075 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.144000 audit[2075]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffff4c48510 a2=0 a3=0 items=0 ppid=1962 pid=2075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.144000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 21:16:34.146000 audit[2077]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2077 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.146000 audit[2077]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=112 a0=3 a1=7ffdaa5926f0 a2=0 a3=0 items=0 ppid=1962 pid=2077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.146000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 21:16:34.148000 audit[2079]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2079 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.148000 audit[2079]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7fffef763d90 a2=0 a3=0 items=0 ppid=1962 pid=2079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.148000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 16 21:16:34.151000 audit[2081]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2081 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.151000 audit[2081]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffd6dd8dd00 a2=0 a3=0 items=0 ppid=1962 pid=2081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.151000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 16 21:16:34.152000 audit[2083]: NETFILTER_CFG 
table=filter:24 family=10 entries=2 op=nft_register_chain pid=2083 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.152000 audit[2083]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff932e06c0 a2=0 a3=0 items=0 ppid=1962 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.152000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 16 21:16:34.154000 audit[2085]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2085 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.154000 audit[2085]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffd5776ece0 a2=0 a3=0 items=0 ppid=1962 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.154000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 16 21:16:34.156000 audit[2087]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.156000 audit[2087]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffe815a1990 a2=0 a3=0 items=0 ppid=1962 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.156000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 21:16:34.158000 audit[2089]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2089 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.158000 audit[2089]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffcc25230e0 a2=0 a3=0 items=0 ppid=1962 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.158000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 16 21:16:34.162000 audit[2094]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2094 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.162000 audit[2094]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff4f8581c0 a2=0 a3=0 items=0 ppid=1962 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.162000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 16 21:16:34.164000 audit[2096]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2096 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.164000 audit[2096]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd0dd38680 a2=0 a3=0 items=0 ppid=1962 pid=2096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:16:34.164000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 16 21:16:34.166000 audit[2098]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2098 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.166000 audit[2098]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffcd6aa1a30 a2=0 a3=0 items=0 ppid=1962 pid=2098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.166000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 16 21:16:34.168000 audit[2100]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.168000 audit[2100]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd4ae40460 a2=0 a3=0 items=0 ppid=1962 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.168000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 16 21:16:34.170000 audit[2102]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.170000 audit[2102]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffe251ef0c0 a2=0 a3=0 items=0 ppid=1962 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.170000 
audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 16 21:16:34.172000 audit[2104]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:34.172000 audit[2104]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffc18efb670 a2=0 a3=0 items=0 ppid=1962 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.172000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 16 21:16:34.199000 audit[2109]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2109 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.199000 audit[2109]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffe102d31b0 a2=0 a3=0 items=0 ppid=1962 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.199000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 16 21:16:34.201000 audit[2111]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.201000 audit[2111]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7fff206a2680 a2=0 a3=0 items=0 ppid=1962 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.201000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 16 21:16:34.209000 audit[2119]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.209000 audit[2119]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffff821ed80 a2=0 a3=0 items=0 ppid=1962 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.209000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 16 21:16:34.220000 audit[2125]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.220000 audit[2125]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff2b4f7c10 a2=0 a3=0 items=0 ppid=1962 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.220000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 16 21:16:34.223000 audit[2127]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.223000 audit[2127]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7fff123eee60 a2=0 a3=0 items=0 ppid=1962 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.223000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 16 21:16:34.225000 audit[2129]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.225000 audit[2129]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fffbdf74610 a2=0 a3=0 items=0 ppid=1962 pid=2129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.225000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 16 21:16:34.227000 audit[2131]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2131 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.227000 audit[2131]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffc84c226a0 a2=0 a3=0 items=0 ppid=1962 pid=2131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.227000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 21:16:34.229000 audit[2133]: NETFILTER_CFG table=filter:41 family=2 entries=1 
op=nft_register_rule pid=2133 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:34.229000 audit[2133]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fffe8abc4d0 a2=0 a3=0 items=0 ppid=1962 pid=2133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:34.229000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 16 21:16:34.230742 systemd-networkd[1571]: docker0: Link UP Jan 16 21:16:34.237537 dockerd[1962]: time="2026-01-16T21:16:34.237486728Z" level=info msg="Loading containers: done." Jan 16 21:16:34.263465 dockerd[1962]: time="2026-01-16T21:16:34.263388036Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 16 21:16:34.263465 dockerd[1962]: time="2026-01-16T21:16:34.263476943Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 16 21:16:34.263737 dockerd[1962]: time="2026-01-16T21:16:34.263562284Z" level=info msg="Initializing buildkit" Jan 16 21:16:34.314947 dockerd[1962]: time="2026-01-16T21:16:34.314859698Z" level=info msg="Completed buildkit initialization" Jan 16 21:16:34.321768 dockerd[1962]: time="2026-01-16T21:16:34.321294812Z" level=info msg="Daemon has completed initialization" Jan 16 21:16:34.321664 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 16 21:16:34.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:34.322904 dockerd[1962]: time="2026-01-16T21:16:34.322491748Z" level=info msg="API listen on /run/docker.sock" Jan 16 21:16:36.001587 containerd[1678]: time="2026-01-16T21:16:36.001545954Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 16 21:16:36.893274 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 16 21:16:36.897354 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 21:16:36.920575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3384381586.mount: Deactivated successfully. Jan 16 21:16:37.071731 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 21:16:37.071000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:37.082081 (kubelet)[2194]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 21:16:37.121646 chronyd[1630]: Selected source PHC0 Jan 16 21:16:37.135329 kubelet[2194]: E0116 21:16:37.135265 2194 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 21:16:37.138548 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 21:16:37.138743 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 21:16:37.138000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 16 21:16:37.139410 systemd[1]: kubelet.service: Consumed 148ms CPU time, 110.7M memory peak. Jan 16 21:16:37.916560 containerd[1678]: time="2026-01-16T21:16:37.915629499Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:37.917224 containerd[1678]: time="2026-01-16T21:16:37.917203592Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=27401903" Jan 16 21:16:37.919922 containerd[1678]: time="2026-01-16T21:16:37.919890669Z" level=info msg="ImageCreate event name:\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:37.923676 containerd[1678]: time="2026-01-16T21:16:37.923630299Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:37.924481 containerd[1678]: time="2026-01-16T21:16:37.924453457Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"29067246\" in 1.922489995s" Jan 16 21:16:37.924557 containerd[1678]: time="2026-01-16T21:16:37.924545687Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\"" Jan 16 21:16:37.925243 containerd[1678]: time="2026-01-16T21:16:37.925210520Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 16 21:16:39.488967 containerd[1678]: 
time="2026-01-16T21:16:39.488880333Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:39.491827 containerd[1678]: time="2026-01-16T21:16:39.491549713Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=24985199" Jan 16 21:16:39.493657 containerd[1678]: time="2026-01-16T21:16:39.493592510Z" level=info msg="ImageCreate event name:\"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:39.496836 containerd[1678]: time="2026-01-16T21:16:39.496796859Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:39.497616 containerd[1678]: time="2026-01-16T21:16:39.497589452Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"26650388\" in 1.5722465s" Jan 16 21:16:39.497714 containerd[1678]: time="2026-01-16T21:16:39.497700843Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\"" Jan 16 21:16:39.498442 containerd[1678]: time="2026-01-16T21:16:39.498312141Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 16 21:16:40.789059 containerd[1678]: time="2026-01-16T21:16:40.788236899Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:40.789453 containerd[1678]: time="2026-01-16T21:16:40.789412615Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=19396939" Jan 16 21:16:40.791179 containerd[1678]: time="2026-01-16T21:16:40.791161028Z" level=info msg="ImageCreate event name:\"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:40.794057 containerd[1678]: time="2026-01-16T21:16:40.794037428Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:40.794875 containerd[1678]: time="2026-01-16T21:16:40.794855995Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"21062128\" in 1.296514359s" Jan 16 21:16:40.794956 containerd[1678]: time="2026-01-16T21:16:40.794945506Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\"" Jan 16 21:16:40.795841 containerd[1678]: time="2026-01-16T21:16:40.795803840Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 16 21:16:41.945388 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1060428040.mount: Deactivated successfully. 
Jan 16 21:16:42.312859 containerd[1678]: time="2026-01-16T21:16:42.312676419Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:42.315272 containerd[1678]: time="2026-01-16T21:16:42.315234284Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=31158177" Jan 16 21:16:42.317070 containerd[1678]: time="2026-01-16T21:16:42.317035503Z" level=info msg="ImageCreate event name:\"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:42.320395 containerd[1678]: time="2026-01-16T21:16:42.320343312Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:42.321033 containerd[1678]: time="2026-01-16T21:16:42.320699434Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"31160918\" in 1.52479289s" Jan 16 21:16:42.321033 containerd[1678]: time="2026-01-16T21:16:42.320730790Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\"" Jan 16 21:16:42.321238 containerd[1678]: time="2026-01-16T21:16:42.321193660Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 16 21:16:43.019635 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount309599244.mount: Deactivated successfully. 
Jan 16 21:16:43.779128 containerd[1678]: time="2026-01-16T21:16:43.779044588Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:43.781174 containerd[1678]: time="2026-01-16T21:16:43.781117210Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=17569975" Jan 16 21:16:43.783607 containerd[1678]: time="2026-01-16T21:16:43.783544501Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:43.787863 containerd[1678]: time="2026-01-16T21:16:43.787788844Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:43.788633 containerd[1678]: time="2026-01-16T21:16:43.788273043Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.467052426s" Jan 16 21:16:43.788633 containerd[1678]: time="2026-01-16T21:16:43.788311684Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 16 21:16:43.788862 containerd[1678]: time="2026-01-16T21:16:43.788812830Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 16 21:16:44.437708 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount50982793.mount: Deactivated successfully. 
Jan 16 21:16:44.456660 containerd[1678]: time="2026-01-16T21:16:44.456195083Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 21:16:44.460071 containerd[1678]: time="2026-01-16T21:16:44.460043587Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 16 21:16:44.464130 containerd[1678]: time="2026-01-16T21:16:44.464097359Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 21:16:44.467021 containerd[1678]: time="2026-01-16T21:16:44.466962919Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 21:16:44.468334 containerd[1678]: time="2026-01-16T21:16:44.467562380Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 678.717763ms" Jan 16 21:16:44.468334 containerd[1678]: time="2026-01-16T21:16:44.467608266Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 16 21:16:44.468663 containerd[1678]: time="2026-01-16T21:16:44.468644094Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 16 21:16:45.284354 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1685051322.mount: Deactivated 
successfully. Jan 16 21:16:46.783918 containerd[1678]: time="2026-01-16T21:16:46.783841642Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:46.788468 containerd[1678]: time="2026-01-16T21:16:46.788414742Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=45502580" Jan 16 21:16:46.792214 containerd[1678]: time="2026-01-16T21:16:46.792156384Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:46.797718 containerd[1678]: time="2026-01-16T21:16:46.797673936Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:16:46.799413 containerd[1678]: time="2026-01-16T21:16:46.799362598Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.330624701s" Jan 16 21:16:46.799413 containerd[1678]: time="2026-01-16T21:16:46.799394104Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 16 21:16:47.144642 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 16 21:16:47.146500 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 21:16:47.289394 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 16 21:16:47.293710 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 16 21:16:47.293795 kernel: audit: type=1130 audit(1768598207.288:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:47.288000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:47.294227 (kubelet)[2389]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 21:16:47.334619 kubelet[2389]: E0116 21:16:47.334526 2389 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 21:16:47.336830 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 21:16:47.337058 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 21:16:47.337667 systemd[1]: kubelet.service: Consumed 141ms CPU time, 110.1M memory peak. Jan 16 21:16:47.335000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 21:16:47.341874 kernel: audit: type=1131 audit(1768598207.335:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 21:16:49.582593 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 16 21:16:49.583099 systemd[1]: kubelet.service: Consumed 141ms CPU time, 110.1M memory peak. Jan 16 21:16:49.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:49.581000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:49.586638 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 21:16:49.587650 kernel: audit: type=1130 audit(1768598209.581:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:49.587688 kernel: audit: type=1131 audit(1768598209.581:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:49.623659 systemd[1]: Reload requested from client PID 2416 ('systemctl') (unit session-10.scope)... Jan 16 21:16:49.623676 systemd[1]: Reloading... Jan 16 21:16:49.725616 zram_generator::config[2459]: No configuration found. Jan 16 21:16:49.924315 systemd[1]: Reloading finished in 300 ms. 
Jan 16 21:16:49.951000 audit: BPF prog-id=63 op=LOAD Jan 16 21:16:49.954609 kernel: audit: type=1334 audit(1768598209.951:297): prog-id=63 op=LOAD Jan 16 21:16:49.953000 audit: BPF prog-id=51 op=UNLOAD Jan 16 21:16:49.957625 kernel: audit: type=1334 audit(1768598209.953:298): prog-id=51 op=UNLOAD Jan 16 21:16:49.953000 audit: BPF prog-id=64 op=LOAD Jan 16 21:16:49.961677 kernel: audit: type=1334 audit(1768598209.953:299): prog-id=64 op=LOAD Jan 16 21:16:49.953000 audit: BPF prog-id=65 op=LOAD Jan 16 21:16:49.964782 kernel: audit: type=1334 audit(1768598209.953:300): prog-id=65 op=LOAD Jan 16 21:16:49.964842 kernel: audit: type=1334 audit(1768598209.953:301): prog-id=52 op=UNLOAD Jan 16 21:16:49.953000 audit: BPF prog-id=52 op=UNLOAD Jan 16 21:16:49.965936 kernel: audit: type=1334 audit(1768598209.953:302): prog-id=53 op=UNLOAD Jan 16 21:16:49.953000 audit: BPF prog-id=53 op=UNLOAD Jan 16 21:16:49.956000 audit: BPF prog-id=66 op=LOAD Jan 16 21:16:49.957000 audit: BPF prog-id=48 op=UNLOAD Jan 16 21:16:49.957000 audit: BPF prog-id=67 op=LOAD Jan 16 21:16:49.957000 audit: BPF prog-id=68 op=LOAD Jan 16 21:16:49.957000 audit: BPF prog-id=49 op=UNLOAD Jan 16 21:16:49.957000 audit: BPF prog-id=50 op=UNLOAD Jan 16 21:16:49.957000 audit: BPF prog-id=69 op=LOAD Jan 16 21:16:49.957000 audit: BPF prog-id=43 op=UNLOAD Jan 16 21:16:49.957000 audit: BPF prog-id=70 op=LOAD Jan 16 21:16:49.958000 audit: BPF prog-id=71 op=LOAD Jan 16 21:16:49.958000 audit: BPF prog-id=44 op=UNLOAD Jan 16 21:16:49.958000 audit: BPF prog-id=45 op=UNLOAD Jan 16 21:16:49.958000 audit: BPF prog-id=72 op=LOAD Jan 16 21:16:49.958000 audit: BPF prog-id=57 op=UNLOAD Jan 16 21:16:49.959000 audit: BPF prog-id=73 op=LOAD Jan 16 21:16:49.959000 audit: BPF prog-id=54 op=UNLOAD Jan 16 21:16:49.959000 audit: BPF prog-id=74 op=LOAD Jan 16 21:16:49.959000 audit: BPF prog-id=75 op=LOAD Jan 16 21:16:49.959000 audit: BPF prog-id=55 op=UNLOAD Jan 16 21:16:49.959000 audit: BPF prog-id=56 op=UNLOAD Jan 16 21:16:49.960000 
audit: BPF prog-id=76 op=LOAD Jan 16 21:16:49.961000 audit: BPF prog-id=59 op=UNLOAD Jan 16 21:16:49.961000 audit: BPF prog-id=77 op=LOAD Jan 16 21:16:49.961000 audit: BPF prog-id=58 op=UNLOAD Jan 16 21:16:49.962000 audit: BPF prog-id=78 op=LOAD Jan 16 21:16:49.962000 audit: BPF prog-id=79 op=LOAD Jan 16 21:16:49.962000 audit: BPF prog-id=46 op=UNLOAD Jan 16 21:16:49.962000 audit: BPF prog-id=47 op=UNLOAD Jan 16 21:16:49.963000 audit: BPF prog-id=80 op=LOAD Jan 16 21:16:49.963000 audit: BPF prog-id=60 op=UNLOAD Jan 16 21:16:49.963000 audit: BPF prog-id=81 op=LOAD Jan 16 21:16:49.963000 audit: BPF prog-id=82 op=LOAD Jan 16 21:16:49.963000 audit: BPF prog-id=61 op=UNLOAD Jan 16 21:16:49.963000 audit: BPF prog-id=62 op=UNLOAD Jan 16 21:16:49.979150 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 16 21:16:49.979227 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 16 21:16:49.979567 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 21:16:49.979653 systemd[1]: kubelet.service: Consumed 99ms CPU time, 98.4M memory peak. Jan 16 21:16:49.978000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 21:16:49.981329 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 21:16:50.110220 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 21:16:50.109000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:16:50.118040 (kubelet)[2516]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 16 21:16:50.163839 kubelet[2516]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 21:16:50.164618 kubelet[2516]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 16 21:16:50.164618 kubelet[2516]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 21:16:50.164618 kubelet[2516]: I0116 21:16:50.164206 2516 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 16 21:16:50.471371 kubelet[2516]: I0116 21:16:50.471328 2516 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 16 21:16:50.471371 kubelet[2516]: I0116 21:16:50.471362 2516 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 16 21:16:50.471865 kubelet[2516]: I0116 21:16:50.471853 2516 server.go:954] "Client rotation is on, will bootstrap in background" Jan 16 21:16:50.523769 kubelet[2516]: E0116 21:16:50.523737 2516 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.3.156:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.3.156:6443: connect: connection refused" logger="UnhandledError" Jan 16 21:16:50.524066 kubelet[2516]: I0116 
21:16:50.524045 2516 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 16 21:16:50.531571 kubelet[2516]: I0116 21:16:50.531554 2516 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 16 21:16:50.535617 kubelet[2516]: I0116 21:16:50.535564 2516 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 16 21:16:50.536327 kubelet[2516]: I0116 21:16:50.535958 2516 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 16 21:16:50.536327 kubelet[2516]: I0116 21:16:50.535987 2516 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4580-0-0-p-be73a47b79","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"Topolo
gyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 16 21:16:50.536327 kubelet[2516]: I0116 21:16:50.536159 2516 topology_manager.go:138] "Creating topology manager with none policy" Jan 16 21:16:50.536327 kubelet[2516]: I0116 21:16:50.536167 2516 container_manager_linux.go:304] "Creating device plugin manager" Jan 16 21:16:50.536496 kubelet[2516]: I0116 21:16:50.536292 2516 state_mem.go:36] "Initialized new in-memory state store" Jan 16 21:16:50.541642 kubelet[2516]: I0116 21:16:50.541621 2516 kubelet.go:446] "Attempting to sync node with API server" Jan 16 21:16:50.541749 kubelet[2516]: I0116 21:16:50.541741 2516 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 16 21:16:50.541804 kubelet[2516]: I0116 21:16:50.541799 2516 kubelet.go:352] "Adding apiserver pod source" Jan 16 21:16:50.541843 kubelet[2516]: I0116 21:16:50.541838 2516 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 16 21:16:50.549180 kubelet[2516]: W0116 21:16:50.549135 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.3.156:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4580-0-0-p-be73a47b79&limit=500&resourceVersion=0": dial tcp 10.0.3.156:6443: connect: connection refused Jan 16 21:16:50.549264 kubelet[2516]: E0116 21:16:50.549191 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.3.156:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4580-0-0-p-be73a47b79&limit=500&resourceVersion=0\": dial tcp 10.0.3.156:6443: connect: connection refused" logger="UnhandledError" Jan 16 21:16:50.549519 kubelet[2516]: W0116 
21:16:50.549489 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.3.156:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.3.156:6443: connect: connection refused Jan 16 21:16:50.549565 kubelet[2516]: E0116 21:16:50.549525 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.3.156:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.3.156:6443: connect: connection refused" logger="UnhandledError" Jan 16 21:16:50.549634 kubelet[2516]: I0116 21:16:50.549621 2516 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 16 21:16:50.549996 kubelet[2516]: I0116 21:16:50.549977 2516 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 16 21:16:50.553619 kubelet[2516]: W0116 21:16:50.552107 2516 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
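The NodeConfig dump earlier in this log lists the kubelet's default hard-eviction thresholds (memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%, all with the LessThan operator). A minimal sketch of how such thresholds evaluate — the dict layout and `breaches` helper are illustrative, not the kubelet's internal types:

```python
# Hard-eviction thresholds as logged in the kubelet NodeConfig above.
# Each threshold is either an absolute quantity (bytes) or a fraction
# of the resource's capacity; all use the LessThan operator.
HARD_EVICTION = {
    "memory.available":   {"quantity": 100 * 1024 * 1024},  # 100Mi
    "nodefs.available":   {"percentage": 0.10},
    "nodefs.inodesFree":  {"percentage": 0.05},
    "imagefs.available":  {"percentage": 0.15},
    "imagefs.inodesFree": {"percentage": 0.05},
}

def breaches(signal: str, observed: float, capacity: float) -> bool:
    """Return True when the observed value falls below the signal's
    threshold (absolute quantity, or percentage of capacity)."""
    t = HARD_EVICTION[signal]
    limit = t.get("quantity", t.get("percentage", 0.0) * capacity)
    return observed < limit

# e.g. 80Mi of memory available on a 4Gi node breaches memory.available
print(breaches("memory.available", 80 * 1024**2, 4 * 1024**3))  # True
```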
Jan 16 21:16:50.554540 kubelet[2516]: I0116 21:16:50.554509 2516 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 16 21:16:50.554579 kubelet[2516]: I0116 21:16:50.554559 2516 server.go:1287] "Started kubelet" Jan 16 21:16:50.558251 kubelet[2516]: I0116 21:16:50.558101 2516 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 16 21:16:50.558251 kubelet[2516]: I0116 21:16:50.558094 2516 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 16 21:16:50.560111 kubelet[2516]: I0116 21:16:50.560090 2516 server.go:479] "Adding debug handlers to kubelet server" Jan 16 21:16:50.561885 kubelet[2516]: I0116 21:16:50.561831 2516 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 16 21:16:50.562654 kubelet[2516]: I0116 21:16:50.562030 2516 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 16 21:16:50.562654 kubelet[2516]: I0116 21:16:50.562419 2516 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 16 21:16:50.563240 kubelet[2516]: I0116 21:16:50.563195 2516 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 16 21:16:50.565943 kubelet[2516]: I0116 21:16:50.565897 2516 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 16 21:16:50.566060 kubelet[2516]: I0116 21:16:50.565960 2516 reconciler.go:26] "Reconciler: start to sync state" Jan 16 21:16:50.571568 kubelet[2516]: E0116 21:16:50.571172 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4580-0-0-p-be73a47b79\" not found" Jan 16 21:16:50.572904 kubelet[2516]: E0116 21:16:50.572878 2516 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-be73a47b79?timeout=10s\": 
dial tcp 10.0.3.156:6443: connect: connection refused" interval="200ms" Jan 16 21:16:50.575150 kubelet[2516]: E0116 21:16:50.572931 2516 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.3.156:6443/api/v1/namespaces/default/events\": dial tcp 10.0.3.156:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4580-0-0-p-be73a47b79.188b52b3ba7e840f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4580-0-0-p-be73a47b79,UID:ci-4580-0-0-p-be73a47b79,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4580-0-0-p-be73a47b79,},FirstTimestamp:2026-01-16 21:16:50.554528783 +0000 UTC m=+0.433148023,LastTimestamp:2026-01-16 21:16:50.554528783 +0000 UTC m=+0.433148023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4580-0-0-p-be73a47b79,}" Jan 16 21:16:50.575273 kubelet[2516]: W0116 21:16:50.575224 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.3.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.3.156:6443: connect: connection refused Jan 16 21:16:50.575303 kubelet[2516]: E0116 21:16:50.575273 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.3.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.3.156:6443: connect: connection refused" logger="UnhandledError" Jan 16 21:16:50.573000 audit[2527]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2527 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:50.573000 audit[2527]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff7da21cc0 a2=0 
a3=0 items=0 ppid=2516 pid=2527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.573000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 16 21:16:50.575903 kubelet[2516]: I0116 21:16:50.575823 2516 factory.go:221] Registration of the systemd container factory successfully Jan 16 21:16:50.575942 kubelet[2516]: I0116 21:16:50.575901 2516 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 16 21:16:50.575000 audit[2528]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2528 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:50.575000 audit[2528]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffb8727920 a2=0 a3=0 items=0 ppid=2516 pid=2528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.575000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 16 21:16:50.577899 kubelet[2516]: E0116 21:16:50.577875 2516 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 16 21:16:50.578056 kubelet[2516]: I0116 21:16:50.578045 2516 factory.go:221] Registration of the containerd container factory successfully Jan 16 21:16:50.578000 audit[2530]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2530 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:50.578000 audit[2530]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fffc18ad720 a2=0 a3=0 items=0 ppid=2516 pid=2530 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.578000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 21:16:50.581000 audit[2532]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2532 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:50.581000 audit[2532]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd3c184680 a2=0 a3=0 items=0 ppid=2516 pid=2532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.581000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 21:16:50.588000 audit[2535]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2535 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:50.588000 audit[2535]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7fff10367430 a2=0 a3=0 items=0 ppid=2516 pid=2535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.588000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 16 21:16:50.590840 kubelet[2516]: I0116 21:16:50.590559 2516 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 16 21:16:50.589000 audit[2537]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2537 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:50.589000 audit[2537]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffff70805c0 a2=0 a3=0 items=0 ppid=2516 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.589000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 16 21:16:50.591983 kubelet[2516]: I0116 21:16:50.591970 2516 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 16 21:16:50.592037 kubelet[2516]: I0116 21:16:50.592031 2516 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 16 21:16:50.592085 kubelet[2516]: I0116 21:16:50.592080 2516 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 16 21:16:50.592422 kubelet[2516]: I0116 21:16:50.592126 2516 kubelet.go:2382] "Starting kubelet main sync loop" Jan 16 21:16:50.592422 kubelet[2516]: E0116 21:16:50.592169 2516 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 16 21:16:50.591000 audit[2538]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2538 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:50.591000 audit[2538]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd046e1700 a2=0 a3=0 items=0 ppid=2516 pid=2538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.591000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 16 21:16:50.592000 audit[2539]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2539 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:50.592000 audit[2539]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff827f2fd0 a2=0 a3=0 items=0 ppid=2516 pid=2539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.592000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 16 21:16:50.593000 audit[2542]: NETFILTER_CFG table=mangle:50 family=10 entries=1 op=nft_register_chain pid=2542 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:50.593000 audit[2542]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffde4088650 a2=0 a3=0 items=0 ppid=2516 pid=2542 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.593000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 16 21:16:50.597288 kubelet[2516]: W0116 21:16:50.597253 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.3.156:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.3.156:6443: connect: connection refused Jan 16 21:16:50.597379 kubelet[2516]: E0116 21:16:50.597367 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.3.156:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.3.156:6443: connect: connection refused" logger="UnhandledError" Jan 16 21:16:50.596000 audit[2543]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2543 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:50.596000 audit[2543]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee516d220 a2=0 a3=0 items=0 ppid=2516 pid=2543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.596000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 16 21:16:50.598000 audit[2544]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2544 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:16:50.598000 audit[2544]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 
a1=7ffd32e00250 a2=0 a3=0 items=0 ppid=2516 pid=2544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.598000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 16 21:16:50.600000 audit[2547]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2547 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:16:50.600000 audit[2547]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe0604c760 a2=0 a3=0 items=0 ppid=2516 pid=2547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:50.600000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 16 21:16:50.602611 kubelet[2516]: I0116 21:16:50.602401 2516 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 16 21:16:50.602611 kubelet[2516]: I0116 21:16:50.602412 2516 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 16 21:16:50.602611 kubelet[2516]: I0116 21:16:50.602425 2516 state_mem.go:36] "Initialized new in-memory state store" Jan 16 21:16:50.605457 kubelet[2516]: I0116 21:16:50.605440 2516 policy_none.go:49] "None policy: Start" Jan 16 21:16:50.605531 kubelet[2516]: I0116 21:16:50.605525 2516 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 16 21:16:50.605572 kubelet[2516]: I0116 21:16:50.605567 2516 state_mem.go:35] "Initializing new in-memory state store" Jan 16 21:16:50.611818 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
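The audit PROCTITLE records above encode each process's command line as hex with NUL-separated arguments. A small helper (assuming ASCII/UTF-8 arguments) recovers the argv; note that audit truncates long proctitles, which is why the runc container-ID paths in later records appear cut short. The first PROCTITLE record above, for example:

```python
def decode_proctitle(hexstr: str) -> list[str]:
    """Decode an audit PROCTITLE hex string into its argv list.
    Arguments are NUL-separated in the raw proctitle bytes."""
    return bytes.fromhex(hexstr).decode("utf-8", errors="replace").split("\x00")

# First audit PROCTITLE record in this log (table=mangle chain creation):
argv = decode_proctitle(
    "69707461626C6573002D770035002D5700313030303030"
    "002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65"
)
print(argv)
# ['iptables', '-w', '5', '-W', '100000', '-N', 'KUBE-IPTABLES-HINT', '-t', 'mangle']
```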
Jan 16 21:16:50.621742 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 16 21:16:50.624773 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 16 21:16:50.635733 kubelet[2516]: I0116 21:16:50.635685 2516 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 16 21:16:50.636000 kubelet[2516]: I0116 21:16:50.635990 2516 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 16 21:16:50.636410 kubelet[2516]: I0116 21:16:50.636382 2516 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 16 21:16:50.637032 kubelet[2516]: I0116 21:16:50.636967 2516 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 16 21:16:50.638359 kubelet[2516]: E0116 21:16:50.638338 2516 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 16 21:16:50.638408 kubelet[2516]: E0116 21:16:50.638375 2516 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4580-0-0-p-be73a47b79\" not found" Jan 16 21:16:50.704003 systemd[1]: Created slice kubepods-burstable-pod66a2eebef0db821f7955b8246b62ff17.slice - libcontainer container kubepods-burstable-pod66a2eebef0db821f7955b8246b62ff17.slice. Jan 16 21:16:50.717952 kubelet[2516]: E0116 21:16:50.717924 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.720733 systemd[1]: Created slice kubepods-burstable-pod6f16329486c6643b17ffc1b877c3ba28.slice - libcontainer container kubepods-burstable-pod6f16329486c6643b17ffc1b877c3ba28.slice. 
Jan 16 21:16:50.722956 kubelet[2516]: E0116 21:16:50.722849 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.727816 systemd[1]: Created slice kubepods-burstable-poda4df0e6682ad774e865e367503d900a5.slice - libcontainer container kubepods-burstable-poda4df0e6682ad774e865e367503d900a5.slice. Jan 16 21:16:50.729811 kubelet[2516]: E0116 21:16:50.729661 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.738270 kubelet[2516]: I0116 21:16:50.738248 2516 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.738684 kubelet[2516]: E0116 21:16:50.738666 2516 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.156:6443/api/v1/nodes\": dial tcp 10.0.3.156:6443: connect: connection refused" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.773508 kubelet[2516]: E0116 21:16:50.773471 2516 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-be73a47b79?timeout=10s\": dial tcp 10.0.3.156:6443: connect: connection refused" interval="400ms" Jan 16 21:16:50.868080 kubelet[2516]: I0116 21:16:50.867867 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/66a2eebef0db821f7955b8246b62ff17-ca-certs\") pod \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" (UID: \"66a2eebef0db821f7955b8246b62ff17\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.868080 kubelet[2516]: I0116 21:16:50.867908 2516 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/66a2eebef0db821f7955b8246b62ff17-k8s-certs\") pod \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" (UID: \"66a2eebef0db821f7955b8246b62ff17\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.868080 kubelet[2516]: I0116 21:16:50.867930 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/66a2eebef0db821f7955b8246b62ff17-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" (UID: \"66a2eebef0db821f7955b8246b62ff17\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.868080 kubelet[2516]: I0116 21:16:50.867949 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-ca-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.868080 kubelet[2516]: I0116 21:16:50.867977 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-flexvolume-dir\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.868351 kubelet[2516]: I0116 21:16:50.867991 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-k8s-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " 
pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.868351 kubelet[2516]: I0116 21:16:50.868006 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a4df0e6682ad774e865e367503d900a5-kubeconfig\") pod \"kube-scheduler-ci-4580-0-0-p-be73a47b79\" (UID: \"a4df0e6682ad774e865e367503d900a5\") " pod="kube-system/kube-scheduler-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.868351 kubelet[2516]: I0116 21:16:50.868019 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-kubeconfig\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.868351 kubelet[2516]: I0116 21:16:50.868035 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.942346 kubelet[2516]: I0116 21:16:50.942275 2516 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:50.943381 kubelet[2516]: E0116 21:16:50.943316 2516 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.156:6443/api/v1/nodes\": dial tcp 10.0.3.156:6443: connect: connection refused" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:51.020361 containerd[1678]: time="2026-01-16T21:16:51.020183758Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4580-0-0-p-be73a47b79,Uid:66a2eebef0db821f7955b8246b62ff17,Namespace:kube-system,Attempt:0,}" Jan 16 21:16:51.027306 containerd[1678]: time="2026-01-16T21:16:51.027245125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4580-0-0-p-be73a47b79,Uid:6f16329486c6643b17ffc1b877c3ba28,Namespace:kube-system,Attempt:0,}" Jan 16 21:16:51.031416 containerd[1678]: time="2026-01-16T21:16:51.031365110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4580-0-0-p-be73a47b79,Uid:a4df0e6682ad774e865e367503d900a5,Namespace:kube-system,Attempt:0,}" Jan 16 21:16:51.081471 containerd[1678]: time="2026-01-16T21:16:51.081286487Z" level=info msg="connecting to shim 6008b5a6dc3b37075b35f333dbaf1bf4b7ece0402552888b2311d674c5638dd2" address="unix:///run/containerd/s/314bb4d53273f158f3a11c082a3992eada224b562e185d2b08256df536c95bad" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:16:51.083616 containerd[1678]: time="2026-01-16T21:16:51.083538489Z" level=info msg="connecting to shim 905553eceb2bbaf9d7d35a7d3e15c55531ed6930684ba1ea4b20824570cd4ff1" address="unix:///run/containerd/s/2bbd2590fd781d3445f3b4b6268b14f4a48654f8eb55a43532a5fb6dcc3bf322" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:16:51.098625 containerd[1678]: time="2026-01-16T21:16:51.098556594Z" level=info msg="connecting to shim d4fcc08819c1599601745ba6dae5e6676dcb7b547a1d80e7c0cb4e5c7027cb28" address="unix:///run/containerd/s/cb3b851f073d8fd6cfbd47c5c855c488e7d07ee86f18b65007b780e2e35bcdc4" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:16:51.128795 systemd[1]: Started cri-containerd-6008b5a6dc3b37075b35f333dbaf1bf4b7ece0402552888b2311d674c5638dd2.scope - libcontainer container 6008b5a6dc3b37075b35f333dbaf1bf4b7ece0402552888b2311d674c5638dd2. 
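The containerd "connecting to shim" entries above carry the sandbox ID and the shim's ttrpc unix socket as key=value fields. A hedged sketch of extracting them with a regex (field names exactly as they appear in these log lines; the sample line is copied from the log above):

```python
import re

# Matches containerd's "connecting to shim <id>" messages and captures
# the shim/sandbox ID and the unix socket address it connects to.
SHIM_RE = re.compile(
    r'msg="connecting to shim (?P<id>[0-9a-f]+)" address="(?P<address>unix://[^"]+)"'
)

line = (
    'time="2026-01-16T21:16:51.081286487Z" level=info '
    'msg="connecting to shim 6008b5a6dc3b37075b35f333dbaf1bf4b7ece0402552888b2311d674c5638dd2" '
    'address="unix:///run/containerd/s/314bb4d53273f158f3a11c082a3992eada224b562e185d2b08256df536c95bad"'
)

m = SHIM_RE.search(line)
print(m["id"][:12], m["address"])
```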
Jan 16 21:16:51.134411 systemd[1]: Started cri-containerd-d4fcc08819c1599601745ba6dae5e6676dcb7b547a1d80e7c0cb4e5c7027cb28.scope - libcontainer container d4fcc08819c1599601745ba6dae5e6676dcb7b547a1d80e7c0cb4e5c7027cb28. Jan 16 21:16:51.138381 systemd[1]: Started cri-containerd-905553eceb2bbaf9d7d35a7d3e15c55531ed6930684ba1ea4b20824570cd4ff1.scope - libcontainer container 905553eceb2bbaf9d7d35a7d3e15c55531ed6930684ba1ea4b20824570cd4ff1. Jan 16 21:16:51.146000 audit: BPF prog-id=83 op=LOAD Jan 16 21:16:51.146000 audit: BPF prog-id=84 op=LOAD Jan 16 21:16:51.146000 audit[2591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2561 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.146000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630303862356136646333623337303735623335663333336462616631 Jan 16 21:16:51.146000 audit: BPF prog-id=84 op=UNLOAD Jan 16 21:16:51.146000 audit[2591]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2561 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.146000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630303862356136646333623337303735623335663333336462616631 Jan 16 21:16:51.147000 audit: BPF prog-id=85 op=LOAD Jan 16 21:16:51.147000 audit[2591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 
a1=c000130488 a2=98 a3=0 items=0 ppid=2561 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630303862356136646333623337303735623335663333336462616631 Jan 16 21:16:51.147000 audit: BPF prog-id=86 op=LOAD Jan 16 21:16:51.147000 audit[2591]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2561 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630303862356136646333623337303735623335663333336462616631 Jan 16 21:16:51.147000 audit: BPF prog-id=86 op=UNLOAD Jan 16 21:16:51.147000 audit[2591]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2561 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630303862356136646333623337303735623335663333336462616631 Jan 16 21:16:51.147000 audit: BPF prog-id=85 op=UNLOAD Jan 16 21:16:51.147000 audit[2591]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2561 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630303862356136646333623337303735623335663333336462616631 Jan 16 21:16:51.147000 audit: BPF prog-id=87 op=LOAD Jan 16 21:16:51.147000 audit[2591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2561 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.147000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630303862356136646333623337303735623335663333336462616631 Jan 16 21:16:51.150000 audit: BPF prog-id=88 op=LOAD Jan 16 21:16:51.150000 audit: BPF prog-id=89 op=LOAD Jan 16 21:16:51.150000 audit[2602]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2570 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.150000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930353535336563656232626261663964376433356137643365313563 Jan 16 
21:16:51.150000 audit: BPF prog-id=89 op=UNLOAD Jan 16 21:16:51.150000 audit[2602]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.150000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930353535336563656232626261663964376433356137643365313563 Jan 16 21:16:51.151000 audit: BPF prog-id=90 op=LOAD Jan 16 21:16:51.151000 audit[2602]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2570 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930353535336563656232626261663964376433356137643365313563 Jan 16 21:16:51.151000 audit: BPF prog-id=91 op=LOAD Jan 16 21:16:51.151000 audit[2602]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2570 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.151000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930353535336563656232626261663964376433356137643365313563 Jan 16 21:16:51.151000 audit: BPF prog-id=91 op=UNLOAD Jan 16 21:16:51.151000 audit[2602]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930353535336563656232626261663964376433356137643365313563 Jan 16 21:16:51.151000 audit: BPF prog-id=90 op=UNLOAD Jan 16 21:16:51.151000 audit[2602]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930353535336563656232626261663964376433356137643365313563 Jan 16 21:16:51.151000 audit: BPF prog-id=92 op=LOAD Jan 16 21:16:51.151000 audit[2602]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2570 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:16:51.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930353535336563656232626261663964376433356137643365313563 Jan 16 21:16:51.156000 audit: BPF prog-id=93 op=LOAD Jan 16 21:16:51.157000 audit: BPF prog-id=94 op=LOAD Jan 16 21:16:51.157000 audit[2618]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2593 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434666363303838313963313539393630313734356261366461653565 Jan 16 21:16:51.157000 audit: BPF prog-id=94 op=UNLOAD Jan 16 21:16:51.157000 audit[2618]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434666363303838313963313539393630313734356261366461653565 Jan 16 21:16:51.157000 audit: BPF prog-id=95 op=LOAD Jan 16 21:16:51.157000 audit[2618]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2593 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434666363303838313963313539393630313734356261366461653565 Jan 16 21:16:51.157000 audit: BPF prog-id=96 op=LOAD Jan 16 21:16:51.157000 audit[2618]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2593 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434666363303838313963313539393630313734356261366461653565 Jan 16 21:16:51.157000 audit: BPF prog-id=96 op=UNLOAD Jan 16 21:16:51.157000 audit[2618]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434666363303838313963313539393630313734356261366461653565 Jan 16 21:16:51.157000 audit: BPF prog-id=95 op=UNLOAD Jan 16 21:16:51.157000 audit[2618]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434666363303838313963313539393630313734356261366461653565 Jan 16 21:16:51.157000 audit: BPF prog-id=97 op=LOAD Jan 16 21:16:51.157000 audit[2618]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2593 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434666363303838313963313539393630313734356261366461653565 Jan 16 21:16:51.174633 kubelet[2516]: E0116 21:16:51.174144 2516 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-be73a47b79?timeout=10s\": dial tcp 10.0.3.156:6443: connect: connection refused" interval="800ms" Jan 16 21:16:51.201672 containerd[1678]: time="2026-01-16T21:16:51.201247020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4580-0-0-p-be73a47b79,Uid:66a2eebef0db821f7955b8246b62ff17,Namespace:kube-system,Attempt:0,} returns sandbox id \"6008b5a6dc3b37075b35f333dbaf1bf4b7ece0402552888b2311d674c5638dd2\"" Jan 16 21:16:51.210092 containerd[1678]: time="2026-01-16T21:16:51.210055149Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4580-0-0-p-be73a47b79,Uid:a4df0e6682ad774e865e367503d900a5,Namespace:kube-system,Attempt:0,} returns sandbox id \"905553eceb2bbaf9d7d35a7d3e15c55531ed6930684ba1ea4b20824570cd4ff1\"" Jan 16 21:16:51.210521 containerd[1678]: time="2026-01-16T21:16:51.210447570Z" level=info msg="CreateContainer within sandbox \"6008b5a6dc3b37075b35f333dbaf1bf4b7ece0402552888b2311d674c5638dd2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 16 21:16:51.212635 containerd[1678]: time="2026-01-16T21:16:51.212494800Z" level=info msg="CreateContainer within sandbox \"905553eceb2bbaf9d7d35a7d3e15c55531ed6930684ba1ea4b20824570cd4ff1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 16 21:16:51.227966 containerd[1678]: time="2026-01-16T21:16:51.227918104Z" level=info msg="Container 640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:16:51.254048 containerd[1678]: time="2026-01-16T21:16:51.253999499Z" level=info msg="CreateContainer within sandbox \"905553eceb2bbaf9d7d35a7d3e15c55531ed6930684ba1ea4b20824570cd4ff1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435\"" Jan 16 21:16:51.254641 containerd[1678]: time="2026-01-16T21:16:51.254586155Z" level=info msg="StartContainer for \"640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435\"" Jan 16 21:16:51.255616 containerd[1678]: time="2026-01-16T21:16:51.255573030Z" level=info msg="connecting to shim 640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435" address="unix:///run/containerd/s/2bbd2590fd781d3445f3b4b6268b14f4a48654f8eb55a43532a5fb6dcc3bf322" protocol=ttrpc version=3 Jan 16 21:16:51.261484 containerd[1678]: time="2026-01-16T21:16:51.261352814Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4580-0-0-p-be73a47b79,Uid:6f16329486c6643b17ffc1b877c3ba28,Namespace:kube-system,Attempt:0,} returns sandbox id \"d4fcc08819c1599601745ba6dae5e6676dcb7b547a1d80e7c0cb4e5c7027cb28\"" Jan 16 21:16:51.263994 containerd[1678]: time="2026-01-16T21:16:51.263961363Z" level=info msg="CreateContainer within sandbox \"d4fcc08819c1599601745ba6dae5e6676dcb7b547a1d80e7c0cb4e5c7027cb28\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 16 21:16:51.267327 containerd[1678]: time="2026-01-16T21:16:51.267242822Z" level=info msg="Container 5ff3a789d055e287b31eef57d1c92482579f8dcc6efdfa13b8539145c4cf5798: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:16:51.278643 containerd[1678]: time="2026-01-16T21:16:51.278311716Z" level=info msg="CreateContainer within sandbox \"6008b5a6dc3b37075b35f333dbaf1bf4b7ece0402552888b2311d674c5638dd2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5ff3a789d055e287b31eef57d1c92482579f8dcc6efdfa13b8539145c4cf5798\"" Jan 16 21:16:51.278987 systemd[1]: Started cri-containerd-640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435.scope - libcontainer container 640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435. 
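The audit PROCTITLE records interleaved above encode the audited process's command line as a hex string with NUL-separated arguments (here they decode to `runc --root /run/containerd/runc/k8s.io --log ...` invocations). A minimal sketch for decoding one of these fields offline, assuming Python; the `decode_proctitle` helper name is illustrative, not part of any audit tooling:

```python
def decode_proctitle(hex_str: str) -> str:
    # PROCTITLE stores argv as raw hex bytes; each argument is
    # separated by a NUL (0x00) byte, which we render as a space.
    raw = bytes.fromhex(hex_str)
    return raw.replace(b"\x00", b" ").decode("utf-8", errors="replace")

# Prefix shared by the PROCTITLE records in this log:
sample = "72756E63002D2D726F6F74"
print(decode_proctitle(sample))  # runc --root
```

Running this over a full `proctitle=` value from the log recovers the complete runc invocation, including the per-container task directory under `/run/containerd/io.containerd.runtime.v2.task/k8s.io/`. The same decoding is what `ausearch -i` performs when interpreting these records.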
Jan 16 21:16:51.280807 containerd[1678]: time="2026-01-16T21:16:51.280503771Z" level=info msg="StartContainer for \"5ff3a789d055e287b31eef57d1c92482579f8dcc6efdfa13b8539145c4cf5798\"" Jan 16 21:16:51.282063 containerd[1678]: time="2026-01-16T21:16:51.281762193Z" level=info msg="connecting to shim 5ff3a789d055e287b31eef57d1c92482579f8dcc6efdfa13b8539145c4cf5798" address="unix:///run/containerd/s/314bb4d53273f158f3a11c082a3992eada224b562e185d2b08256df536c95bad" protocol=ttrpc version=3 Jan 16 21:16:51.285212 containerd[1678]: time="2026-01-16T21:16:51.285190810Z" level=info msg="Container cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:16:51.296717 containerd[1678]: time="2026-01-16T21:16:51.295651999Z" level=info msg="CreateContainer within sandbox \"d4fcc08819c1599601745ba6dae5e6676dcb7b547a1d80e7c0cb4e5c7027cb28\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae\"" Jan 16 21:16:51.297051 containerd[1678]: time="2026-01-16T21:16:51.297031985Z" level=info msg="StartContainer for \"cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae\"" Jan 16 21:16:51.300716 containerd[1678]: time="2026-01-16T21:16:51.300647513Z" level=info msg="connecting to shim cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae" address="unix:///run/containerd/s/cb3b851f073d8fd6cfbd47c5c855c488e7d07ee86f18b65007b780e2e35bcdc4" protocol=ttrpc version=3 Jan 16 21:16:51.299000 audit: BPF prog-id=98 op=LOAD Jan 16 21:16:51.300000 audit: BPF prog-id=99 op=LOAD Jan 16 21:16:51.300000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2570 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.300000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634303039336231383665383131393638623536386533373564353361 Jan 16 21:16:51.301000 audit: BPF prog-id=99 op=UNLOAD Jan 16 21:16:51.301000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634303039336231383665383131393638623536386533373564353361 Jan 16 21:16:51.301000 audit: BPF prog-id=100 op=LOAD Jan 16 21:16:51.301000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2570 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634303039336231383665383131393638623536386533373564353361 Jan 16 21:16:51.301000 audit: BPF prog-id=101 op=LOAD Jan 16 21:16:51.301000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2570 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634303039336231383665383131393638623536386533373564353361 Jan 16 21:16:51.301000 audit: BPF prog-id=101 op=UNLOAD Jan 16 21:16:51.301000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634303039336231383665383131393638623536386533373564353361 Jan 16 21:16:51.301000 audit: BPF prog-id=100 op=UNLOAD Jan 16 21:16:51.301000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634303039336231383665383131393638623536386533373564353361 Jan 16 21:16:51.301000 audit: BPF prog-id=102 op=LOAD Jan 16 21:16:51.301000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2570 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634303039336231383665383131393638623536386533373564353361 Jan 16 21:16:51.308768 systemd[1]: Started cri-containerd-5ff3a789d055e287b31eef57d1c92482579f8dcc6efdfa13b8539145c4cf5798.scope - libcontainer container 5ff3a789d055e287b31eef57d1c92482579f8dcc6efdfa13b8539145c4cf5798. Jan 16 21:16:51.326758 systemd[1]: Started cri-containerd-cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae.scope - libcontainer container cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae. Jan 16 21:16:51.335000 audit: BPF prog-id=103 op=LOAD Jan 16 21:16:51.336000 audit: BPF prog-id=104 op=LOAD Jan 16 21:16:51.336000 audit[2702]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2561 pid=2702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566663361373839643035356532383762333165656635376431633932 Jan 16 21:16:51.336000 audit: BPF prog-id=104 op=UNLOAD Jan 16 21:16:51.336000 audit[2702]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2561 pid=2702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.336000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566663361373839643035356532383762333165656635376431633932 Jan 16 21:16:51.336000 audit: BPF prog-id=105 op=LOAD Jan 16 21:16:51.336000 audit[2702]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2561 pid=2702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566663361373839643035356532383762333165656635376431633932 Jan 16 21:16:51.336000 audit: BPF prog-id=106 op=LOAD Jan 16 21:16:51.336000 audit[2702]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2561 pid=2702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566663361373839643035356532383762333165656635376431633932 Jan 16 21:16:51.336000 audit: BPF prog-id=106 op=UNLOAD Jan 16 21:16:51.336000 audit[2702]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2561 pid=2702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 21:16:51.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566663361373839643035356532383762333165656635376431633932 Jan 16 21:16:51.336000 audit: BPF prog-id=105 op=UNLOAD Jan 16 21:16:51.336000 audit[2702]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2561 pid=2702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566663361373839643035356532383762333165656635376431633932 Jan 16 21:16:51.336000 audit: BPF prog-id=107 op=LOAD Jan 16 21:16:51.336000 audit[2702]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2561 pid=2702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566663361373839643035356532383762333165656635376431633932 Jan 16 21:16:51.347165 kubelet[2516]: I0116 21:16:51.346517 2516 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:51.347515 kubelet[2516]: E0116 21:16:51.347484 2516 kubelet_node_status.go:107] "Unable to register node with API server" err="Post 
\"https://10.0.3.156:6443/api/v1/nodes\": dial tcp 10.0.3.156:6443: connect: connection refused" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:51.347000 audit: BPF prog-id=108 op=LOAD Jan 16 21:16:51.348000 audit: BPF prog-id=109 op=LOAD Jan 16 21:16:51.348000 audit[2722]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2593 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363333931643131366431353831306435313937356562343232356164 Jan 16 21:16:51.348000 audit: BPF prog-id=109 op=UNLOAD Jan 16 21:16:51.348000 audit[2722]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363333931643131366431353831306435313937356562343232356164 Jan 16 21:16:51.348000 audit: BPF prog-id=110 op=LOAD Jan 16 21:16:51.348000 audit[2722]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2593 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.348000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363333931643131366431353831306435313937356562343232356164 Jan 16 21:16:51.349000 audit: BPF prog-id=111 op=LOAD Jan 16 21:16:51.349000 audit[2722]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2593 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363333931643131366431353831306435313937356562343232356164 Jan 16 21:16:51.349000 audit: BPF prog-id=111 op=UNLOAD Jan 16 21:16:51.349000 audit[2722]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363333931643131366431353831306435313937356562343232356164 Jan 16 21:16:51.349000 audit: BPF prog-id=110 op=UNLOAD Jan 16 21:16:51.349000 audit[2722]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:16:51.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363333931643131366431353831306435313937356562343232356164 Jan 16 21:16:51.349000 audit: BPF prog-id=112 op=LOAD Jan 16 21:16:51.349000 audit[2722]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2593 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:16:51.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363333931643131366431353831306435313937356562343232356164 Jan 16 21:16:51.362386 containerd[1678]: time="2026-01-16T21:16:51.362239537Z" level=info msg="StartContainer for \"640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435\" returns successfully" Jan 16 21:16:51.409514 containerd[1678]: time="2026-01-16T21:16:51.409434083Z" level=info msg="StartContainer for \"cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae\" returns successfully" Jan 16 21:16:51.413794 containerd[1678]: time="2026-01-16T21:16:51.413771215Z" level=info msg="StartContainer for \"5ff3a789d055e287b31eef57d1c92482579f8dcc6efdfa13b8539145c4cf5798\" returns successfully" Jan 16 21:16:51.465969 kubelet[2516]: W0116 21:16:51.465662 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.3.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.3.156:6443: connect: connection refused Jan 16 21:16:51.465969 kubelet[2516]: E0116 21:16:51.465730 2516 reflector.go:166] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.3.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.3.156:6443: connect: connection refused" logger="UnhandledError" Jan 16 21:16:51.610961 kubelet[2516]: E0116 21:16:51.608250 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:51.611702 kubelet[2516]: E0116 21:16:51.611685 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:51.613664 kubelet[2516]: E0116 21:16:51.613647 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.149613 kubelet[2516]: I0116 21:16:52.149581 2516 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.616906 kubelet[2516]: E0116 21:16:52.616828 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.617194 kubelet[2516]: E0116 21:16:52.617017 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.869259 kubelet[2516]: E0116 21:16:52.869159 2516 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4580-0-0-p-be73a47b79\" not found" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.916748 kubelet[2516]: I0116 21:16:52.916715 2516 kubelet_node_status.go:78] 
"Successfully registered node" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.972332 kubelet[2516]: I0116 21:16:52.972291 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.978075 kubelet[2516]: E0116 21:16:52.978048 2516 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.978075 kubelet[2516]: I0116 21:16:52.978077 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.979918 kubelet[2516]: E0116 21:16:52.979899 2516 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.979944 kubelet[2516]: I0116 21:16:52.979922 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:52.985869 kubelet[2516]: E0116 21:16:52.985826 2516 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4580-0-0-p-be73a47b79\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:53.551613 kubelet[2516]: I0116 21:16:53.551569 2516 apiserver.go:52] "Watching apiserver" Jan 16 21:16:53.567044 kubelet[2516]: I0116 21:16:53.566988 2516 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 16 21:16:53.615798 kubelet[2516]: I0116 21:16:53.615766 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:53.617767 kubelet[2516]: 
E0116 21:16:53.617731 2516 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.081362 systemd[1]: Reload requested from client PID 2785 ('systemctl') (unit session-10.scope)... Jan 16 21:16:55.081378 systemd[1]: Reloading... Jan 16 21:16:55.158755 zram_generator::config[2840]: No configuration found. Jan 16 21:16:55.354182 systemd[1]: Reloading finished in 272 ms. Jan 16 21:16:55.385650 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 21:16:55.397518 systemd[1]: kubelet.service: Deactivated successfully. Jan 16 21:16:55.397830 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 21:16:55.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:55.398847 kernel: kauditd_printk_skb: 204 callbacks suppressed Jan 16 21:16:55.398904 kernel: audit: type=1131 audit(1768598215.396:399): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:55.401385 systemd[1]: kubelet.service: Consumed 718ms CPU time, 129.8M memory peak. Jan 16 21:16:55.403354 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 16 21:16:55.407614 kernel: audit: type=1334 audit(1768598215.402:400): prog-id=113 op=LOAD Jan 16 21:16:55.407667 kernel: audit: type=1334 audit(1768598215.402:401): prog-id=69 op=UNLOAD Jan 16 21:16:55.407686 kernel: audit: type=1334 audit(1768598215.402:402): prog-id=114 op=LOAD Jan 16 21:16:55.402000 audit: BPF prog-id=113 op=LOAD Jan 16 21:16:55.402000 audit: BPF prog-id=69 op=UNLOAD Jan 16 21:16:55.402000 audit: BPF prog-id=114 op=LOAD Jan 16 21:16:55.412731 kernel: audit: type=1334 audit(1768598215.402:403): prog-id=115 op=LOAD Jan 16 21:16:55.412815 kernel: audit: type=1334 audit(1768598215.402:404): prog-id=70 op=UNLOAD Jan 16 21:16:55.412836 kernel: audit: type=1334 audit(1768598215.402:405): prog-id=71 op=UNLOAD Jan 16 21:16:55.402000 audit: BPF prog-id=115 op=LOAD Jan 16 21:16:55.402000 audit: BPF prog-id=70 op=UNLOAD Jan 16 21:16:55.402000 audit: BPF prog-id=71 op=UNLOAD Jan 16 21:16:55.402000 audit: BPF prog-id=116 op=LOAD Jan 16 21:16:55.402000 audit: BPF prog-id=72 op=UNLOAD Jan 16 21:16:55.415796 kernel: audit: type=1334 audit(1768598215.402:406): prog-id=116 op=LOAD Jan 16 21:16:55.415830 kernel: audit: type=1334 audit(1768598215.402:407): prog-id=72 op=UNLOAD Jan 16 21:16:55.415848 kernel: audit: type=1334 audit(1768598215.403:408): prog-id=117 op=LOAD Jan 16 21:16:55.403000 audit: BPF prog-id=117 op=LOAD Jan 16 21:16:55.403000 audit: BPF prog-id=118 op=LOAD Jan 16 21:16:55.403000 audit: BPF prog-id=78 op=UNLOAD Jan 16 21:16:55.403000 audit: BPF prog-id=79 op=UNLOAD Jan 16 21:16:55.404000 audit: BPF prog-id=119 op=LOAD Jan 16 21:16:55.404000 audit: BPF prog-id=63 op=UNLOAD Jan 16 21:16:55.404000 audit: BPF prog-id=120 op=LOAD Jan 16 21:16:55.404000 audit: BPF prog-id=121 op=LOAD Jan 16 21:16:55.404000 audit: BPF prog-id=64 op=UNLOAD Jan 16 21:16:55.404000 audit: BPF prog-id=65 op=UNLOAD Jan 16 21:16:55.404000 audit: BPF prog-id=122 op=LOAD Jan 16 21:16:55.404000 audit: BPF prog-id=66 op=UNLOAD Jan 16 21:16:55.404000 audit: BPF prog-id=123 
op=LOAD Jan 16 21:16:55.404000 audit: BPF prog-id=124 op=LOAD Jan 16 21:16:55.404000 audit: BPF prog-id=67 op=UNLOAD Jan 16 21:16:55.404000 audit: BPF prog-id=68 op=UNLOAD Jan 16 21:16:55.406000 audit: BPF prog-id=125 op=LOAD Jan 16 21:16:55.406000 audit: BPF prog-id=76 op=UNLOAD Jan 16 21:16:55.406000 audit: BPF prog-id=126 op=LOAD Jan 16 21:16:55.406000 audit: BPF prog-id=73 op=UNLOAD Jan 16 21:16:55.406000 audit: BPF prog-id=127 op=LOAD Jan 16 21:16:55.406000 audit: BPF prog-id=128 op=LOAD Jan 16 21:16:55.406000 audit: BPF prog-id=74 op=UNLOAD Jan 16 21:16:55.406000 audit: BPF prog-id=75 op=UNLOAD Jan 16 21:16:55.407000 audit: BPF prog-id=129 op=LOAD Jan 16 21:16:55.407000 audit: BPF prog-id=77 op=UNLOAD Jan 16 21:16:55.408000 audit: BPF prog-id=130 op=LOAD Jan 16 21:16:55.408000 audit: BPF prog-id=80 op=UNLOAD Jan 16 21:16:55.408000 audit: BPF prog-id=131 op=LOAD Jan 16 21:16:55.408000 audit: BPF prog-id=132 op=LOAD Jan 16 21:16:55.408000 audit: BPF prog-id=81 op=UNLOAD Jan 16 21:16:55.408000 audit: BPF prog-id=82 op=UNLOAD Jan 16 21:16:55.548552 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 21:16:55.547000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:16:55.558335 (kubelet)[2882]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 16 21:16:55.612564 kubelet[2882]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 21:16:55.612564 kubelet[2882]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Jan 16 21:16:55.612564 kubelet[2882]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 21:16:55.612872 kubelet[2882]: I0116 21:16:55.612695 2882 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 16 21:16:55.619349 kubelet[2882]: I0116 21:16:55.618821 2882 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 16 21:16:55.619349 kubelet[2882]: I0116 21:16:55.618844 2882 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 16 21:16:55.619349 kubelet[2882]: I0116 21:16:55.619074 2882 server.go:954] "Client rotation is on, will bootstrap in background" Jan 16 21:16:55.623174 kubelet[2882]: I0116 21:16:55.623153 2882 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 16 21:16:55.625202 kubelet[2882]: I0116 21:16:55.625186 2882 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 16 21:16:55.628444 kubelet[2882]: I0116 21:16:55.628427 2882 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 16 21:16:55.631100 kubelet[2882]: I0116 21:16:55.631076 2882 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 16 21:16:55.631253 kubelet[2882]: I0116 21:16:55.631230 2882 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 16 21:16:55.631411 kubelet[2882]: I0116 21:16:55.631253 2882 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4580-0-0-p-be73a47b79","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 16 21:16:55.631492 kubelet[2882]: I0116 21:16:55.631414 2882 topology_manager.go:138] "Creating topology manager 
with none policy" Jan 16 21:16:55.631492 kubelet[2882]: I0116 21:16:55.631422 2882 container_manager_linux.go:304] "Creating device plugin manager" Jan 16 21:16:55.631492 kubelet[2882]: I0116 21:16:55.631461 2882 state_mem.go:36] "Initialized new in-memory state store" Jan 16 21:16:55.631601 kubelet[2882]: I0116 21:16:55.631583 2882 kubelet.go:446] "Attempting to sync node with API server" Jan 16 21:16:55.632064 kubelet[2882]: I0116 21:16:55.632051 2882 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 16 21:16:55.632092 kubelet[2882]: I0116 21:16:55.632081 2882 kubelet.go:352] "Adding apiserver pod source" Jan 16 21:16:55.632113 kubelet[2882]: I0116 21:16:55.632094 2882 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 16 21:16:55.633341 kubelet[2882]: I0116 21:16:55.633325 2882 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 16 21:16:55.634616 kubelet[2882]: I0116 21:16:55.633705 2882 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 16 21:16:55.634616 kubelet[2882]: I0116 21:16:55.634055 2882 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 16 21:16:55.634616 kubelet[2882]: I0116 21:16:55.634079 2882 server.go:1287] "Started kubelet" Jan 16 21:16:55.642451 kubelet[2882]: I0116 21:16:55.642423 2882 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 16 21:16:55.649161 kubelet[2882]: I0116 21:16:55.648560 2882 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 16 21:16:55.652151 kubelet[2882]: I0116 21:16:55.652133 2882 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 16 21:16:55.652366 kubelet[2882]: E0116 21:16:55.652342 2882 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4580-0-0-p-be73a47b79\" not 
found" Jan 16 21:16:55.653994 kubelet[2882]: I0116 21:16:55.653980 2882 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 16 21:16:55.654107 kubelet[2882]: I0116 21:16:55.654097 2882 reconciler.go:26] "Reconciler: start to sync state" Jan 16 21:16:55.657549 kubelet[2882]: I0116 21:16:55.655637 2882 factory.go:221] Registration of the systemd container factory successfully Jan 16 21:16:55.657549 kubelet[2882]: I0116 21:16:55.655717 2882 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 16 21:16:55.657549 kubelet[2882]: I0116 21:16:55.656582 2882 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 16 21:16:55.657549 kubelet[2882]: I0116 21:16:55.656793 2882 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 16 21:16:55.657549 kubelet[2882]: I0116 21:16:55.656838 2882 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 16 21:16:55.658230 kubelet[2882]: I0116 21:16:55.657885 2882 server.go:479] "Adding debug handlers to kubelet server" Jan 16 21:16:55.660058 kubelet[2882]: E0116 21:16:55.660041 2882 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 16 21:16:55.661745 kubelet[2882]: I0116 21:16:55.661725 2882 factory.go:221] Registration of the containerd container factory successfully Jan 16 21:16:55.667646 kubelet[2882]: I0116 21:16:55.667555 2882 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 16 21:16:55.668523 kubelet[2882]: I0116 21:16:55.668503 2882 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 16 21:16:55.668568 kubelet[2882]: I0116 21:16:55.668528 2882 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 16 21:16:55.668568 kubelet[2882]: I0116 21:16:55.668545 2882 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 16 21:16:55.668568 kubelet[2882]: I0116 21:16:55.668553 2882 kubelet.go:2382] "Starting kubelet main sync loop" Jan 16 21:16:55.668660 kubelet[2882]: E0116 21:16:55.668602 2882 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711256 2882 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711271 2882 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711289 2882 state_mem.go:36] "Initialized new in-memory state store" Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711439 2882 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711448 2882 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711463 2882 policy_none.go:49] "None policy: Start" Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711473 2882 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711482 2882 state_mem.go:35] "Initializing new in-memory state store" Jan 16 21:16:55.712203 kubelet[2882]: I0116 21:16:55.711569 2882 state_mem.go:75] "Updated machine memory state" Jan 16 21:16:55.715304 kubelet[2882]: I0116 21:16:55.715282 2882 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 16 21:16:55.715899 kubelet[2882]: I0116 
21:16:55.715883 2882 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 16 21:16:55.715943 kubelet[2882]: I0116 21:16:55.715897 2882 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 16 21:16:55.716211 kubelet[2882]: I0116 21:16:55.716196 2882 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 16 21:16:55.719008 kubelet[2882]: E0116 21:16:55.718608 2882 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 16 21:16:55.770079 kubelet[2882]: I0116 21:16:55.770038 2882 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.770274 kubelet[2882]: I0116 21:16:55.770092 2882 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.770381 kubelet[2882]: I0116 21:16:55.770175 2882 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.818791 kubelet[2882]: I0116 21:16:55.818762 2882 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.829636 kubelet[2882]: I0116 21:16:55.829267 2882 kubelet_node_status.go:124] "Node was previously registered" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.829636 kubelet[2882]: I0116 21:16:55.829342 2882 kubelet_node_status.go:78] "Successfully registered node" node="ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955165 kubelet[2882]: I0116 21:16:55.955109 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/66a2eebef0db821f7955b8246b62ff17-k8s-certs\") pod \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" (UID: \"66a2eebef0db821f7955b8246b62ff17\") " 
pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955165 kubelet[2882]: I0116 21:16:55.955145 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/66a2eebef0db821f7955b8246b62ff17-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" (UID: \"66a2eebef0db821f7955b8246b62ff17\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955165 kubelet[2882]: I0116 21:16:55.955165 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-flexvolume-dir\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955352 kubelet[2882]: I0116 21:16:55.955182 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-kubeconfig\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955352 kubelet[2882]: I0116 21:16:55.955198 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a4df0e6682ad774e865e367503d900a5-kubeconfig\") pod \"kube-scheduler-ci-4580-0-0-p-be73a47b79\" (UID: \"a4df0e6682ad774e865e367503d900a5\") " pod="kube-system/kube-scheduler-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955352 kubelet[2882]: I0116 21:16:55.955211 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/66a2eebef0db821f7955b8246b62ff17-ca-certs\") pod \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" (UID: \"66a2eebef0db821f7955b8246b62ff17\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955352 kubelet[2882]: I0116 21:16:55.955225 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-k8s-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955352 kubelet[2882]: I0116 21:16:55.955239 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:55.955459 kubelet[2882]: I0116 21:16:55.955254 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6f16329486c6643b17ffc1b877c3ba28-ca-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-be73a47b79\" (UID: \"6f16329486c6643b17ffc1b877c3ba28\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:56.638240 kubelet[2882]: I0116 21:16:56.638203 2882 apiserver.go:52] "Watching apiserver" Jan 16 21:16:56.654731 kubelet[2882]: I0116 21:16:56.654677 2882 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 16 21:16:56.695739 kubelet[2882]: I0116 21:16:56.695414 2882 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 
21:16:56.705843 kubelet[2882]: E0116 21:16:56.705484 2882 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4580-0-0-p-be73a47b79\" already exists" pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" Jan 16 21:16:56.727390 kubelet[2882]: I0116 21:16:56.727250 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4580-0-0-p-be73a47b79" podStartSLOduration=1.727228491 podStartE2EDuration="1.727228491s" podCreationTimestamp="2026-01-16 21:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 21:16:56.718231268 +0000 UTC m=+1.153170147" watchObservedRunningTime="2026-01-16 21:16:56.727228491 +0000 UTC m=+1.162167350" Jan 16 21:16:56.739269 kubelet[2882]: I0116 21:16:56.738994 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-be73a47b79" podStartSLOduration=1.738973415 podStartE2EDuration="1.738973415s" podCreationTimestamp="2026-01-16 21:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 21:16:56.727580703 +0000 UTC m=+1.162519563" watchObservedRunningTime="2026-01-16 21:16:56.738973415 +0000 UTC m=+1.173912293" Jan 16 21:16:56.750073 kubelet[2882]: I0116 21:16:56.750023 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4580-0-0-p-be73a47b79" podStartSLOduration=1.7500035729999999 podStartE2EDuration="1.750003573s" podCreationTimestamp="2026-01-16 21:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 21:16:56.739714836 +0000 UTC m=+1.174653717" watchObservedRunningTime="2026-01-16 21:16:56.750003573 +0000 UTC m=+1.184942433" Jan 16 21:16:58.994067 
update_engine[1649]: I20260116 21:16:58.993655 1649 update_attempter.cc:509] Updating boot flags... Jan 16 21:17:00.901706 kubelet[2882]: I0116 21:17:00.901668 2882 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 16 21:17:00.902179 containerd[1678]: time="2026-01-16T21:17:00.902139634Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 16 21:17:00.902427 kubelet[2882]: I0116 21:17:00.902361 2882 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 16 21:17:01.823239 systemd[1]: Created slice kubepods-besteffort-podfbe326b7_5b5e_4537_a15b_2e25fc9b1306.slice - libcontainer container kubepods-besteffort-podfbe326b7_5b5e_4537_a15b_2e25fc9b1306.slice. Jan 16 21:17:01.829099 kubelet[2882]: I0116 21:17:01.829063 2882 status_manager.go:890] "Failed to get status for pod" podUID="fbe326b7-5b5e-4537-a15b-2e25fc9b1306" pod="kube-system/kube-proxy-9dtk9" err="pods \"kube-proxy-9dtk9\" is forbidden: User \"system:node:ci-4580-0-0-p-be73a47b79\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4580-0-0-p-be73a47b79' and this object" Jan 16 21:17:01.893448 kubelet[2882]: I0116 21:17:01.893299 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/fbe326b7-5b5e-4537-a15b-2e25fc9b1306-kube-proxy\") pod \"kube-proxy-9dtk9\" (UID: \"fbe326b7-5b5e-4537-a15b-2e25fc9b1306\") " pod="kube-system/kube-proxy-9dtk9" Jan 16 21:17:01.893448 kubelet[2882]: I0116 21:17:01.893351 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tvj\" (UniqueName: \"kubernetes.io/projected/fbe326b7-5b5e-4537-a15b-2e25fc9b1306-kube-api-access-b2tvj\") pod \"kube-proxy-9dtk9\" (UID: 
\"fbe326b7-5b5e-4537-a15b-2e25fc9b1306\") " pod="kube-system/kube-proxy-9dtk9" Jan 16 21:17:01.893448 kubelet[2882]: I0116 21:17:01.893373 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fbe326b7-5b5e-4537-a15b-2e25fc9b1306-xtables-lock\") pod \"kube-proxy-9dtk9\" (UID: \"fbe326b7-5b5e-4537-a15b-2e25fc9b1306\") " pod="kube-system/kube-proxy-9dtk9" Jan 16 21:17:01.893448 kubelet[2882]: I0116 21:17:01.893386 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbe326b7-5b5e-4537-a15b-2e25fc9b1306-lib-modules\") pod \"kube-proxy-9dtk9\" (UID: \"fbe326b7-5b5e-4537-a15b-2e25fc9b1306\") " pod="kube-system/kube-proxy-9dtk9" Jan 16 21:17:01.946821 systemd[1]: Created slice kubepods-besteffort-pod36ce1787_3f3c_4c7e_9c41_70a6682f1982.slice - libcontainer container kubepods-besteffort-pod36ce1787_3f3c_4c7e_9c41_70a6682f1982.slice. 
Jan 16 21:17:01.994214 kubelet[2882]: I0116 21:17:01.994157 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/36ce1787-3f3c-4c7e-9c41-70a6682f1982-var-lib-calico\") pod \"tigera-operator-7dcd859c48-r82fq\" (UID: \"36ce1787-3f3c-4c7e-9c41-70a6682f1982\") " pod="tigera-operator/tigera-operator-7dcd859c48-r82fq" Jan 16 21:17:01.994579 kubelet[2882]: I0116 21:17:01.994265 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bs7\" (UniqueName: \"kubernetes.io/projected/36ce1787-3f3c-4c7e-9c41-70a6682f1982-kube-api-access-b8bs7\") pod \"tigera-operator-7dcd859c48-r82fq\" (UID: \"36ce1787-3f3c-4c7e-9c41-70a6682f1982\") " pod="tigera-operator/tigera-operator-7dcd859c48-r82fq" Jan 16 21:17:02.133303 containerd[1678]: time="2026-01-16T21:17:02.133172914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9dtk9,Uid:fbe326b7-5b5e-4537-a15b-2e25fc9b1306,Namespace:kube-system,Attempt:0,}" Jan 16 21:17:02.163998 containerd[1678]: time="2026-01-16T21:17:02.163699793Z" level=info msg="connecting to shim 052442c181d99ef92444686b6837d1d6e2f6ab35dfe4e72cb822745e5857b14b" address="unix:///run/containerd/s/454973b2ac31fbb6a6b7c3a4ec360a565453073c4f646e663bec5dc899da5ecf" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:02.195038 systemd[1]: Started cri-containerd-052442c181d99ef92444686b6837d1d6e2f6ab35dfe4e72cb822745e5857b14b.scope - libcontainer container 052442c181d99ef92444686b6837d1d6e2f6ab35dfe4e72cb822745e5857b14b. 
Jan 16 21:17:02.203000 audit: BPF prog-id=133 op=LOAD Jan 16 21:17:02.206922 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 16 21:17:02.206990 kernel: audit: type=1334 audit(1768598222.203:441): prog-id=133 op=LOAD Jan 16 21:17:02.206000 audit: BPF prog-id=134 op=LOAD Jan 16 21:17:02.209689 kernel: audit: type=1334 audit(1768598222.206:442): prog-id=134 op=LOAD Jan 16 21:17:02.209737 kernel: audit: type=1300 audit(1768598222.206:442): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.206000 audit[2963]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.218034 kernel: audit: type=1327 audit(1768598222.206:442): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.218126 kernel: audit: type=1334 audit(1768598222.206:443): prog-id=134 op=UNLOAD Jan 16 21:17:02.206000 audit: BPF prog-id=134 op=UNLOAD Jan 16 21:17:02.206000 audit[2963]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2951 pid=2963 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.227659 kernel: audit: type=1300 audit(1768598222.206:443): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.227758 kernel: audit: type=1327 audit(1768598222.206:443): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.232694 kernel: audit: type=1334 audit(1768598222.206:444): prog-id=135 op=LOAD Jan 16 21:17:02.232791 kernel: audit: type=1300 audit(1768598222.206:444): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.206000 audit: BPF prog-id=135 op=LOAD Jan 16 21:17:02.206000 audit[2963]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:02.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.236940 kernel: audit: type=1327 audit(1768598222.206:444): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.206000 audit: BPF prog-id=136 op=LOAD Jan 16 21:17:02.206000 audit[2963]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.206000 audit: BPF prog-id=136 op=UNLOAD Jan 16 21:17:02.206000 audit[2963]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 
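The `proctitle=` field in these audit records is the process's argv, hex-encoded with NUL bytes separating the individual arguments. A minimal decoder (a generic sketch, not tied to any particular audit tooling) recovers the readable command line; the records above decode to the `runc` invocation that containerd issued (the container-ID path is truncated in the log and left truncated here):

```python
def decode_proctitle(hex_str: str) -> str:
    """Decode an audit PROCTITLE hex string: the bytes are the raw argv,
    with NUL bytes separating the individual arguments."""
    raw = bytes.fromhex(hex_str)
    return " ".join(arg.decode("utf-8", errors="replace")
                    for arg in raw.split(b"\x00") if arg)

# Prefix of the proctitle value from the records above:
title = ("72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63"
         "2F6B38732E696F002D2D6C6F67")
print(decode_proctitle(title))
# runc --root /run/containerd/runc/k8s.io --log
```

The same decoder applies to every `proctitle=` field in this log, including the `iptables`/`ip6tables` records further down.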
Jan 16 21:17:02.206000 audit: BPF prog-id=135 op=UNLOAD Jan 16 21:17:02.206000 audit[2963]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.206000 audit: BPF prog-id=137 op=LOAD Jan 16 21:17:02.206000 audit[2963]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2951 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035323434326331383164393965663932343434363836623638333764 Jan 16 21:17:02.247831 containerd[1678]: time="2026-01-16T21:17:02.247801432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9dtk9,Uid:fbe326b7-5b5e-4537-a15b-2e25fc9b1306,Namespace:kube-system,Attempt:0,} returns sandbox id \"052442c181d99ef92444686b6837d1d6e2f6ab35dfe4e72cb822745e5857b14b\"" Jan 16 21:17:02.254275 containerd[1678]: time="2026-01-16T21:17:02.253569749Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-r82fq,Uid:36ce1787-3f3c-4c7e-9c41-70a6682f1982,Namespace:tigera-operator,Attempt:0,}" Jan 16 21:17:02.255264 containerd[1678]: 
time="2026-01-16T21:17:02.255240886Z" level=info msg="CreateContainer within sandbox \"052442c181d99ef92444686b6837d1d6e2f6ab35dfe4e72cb822745e5857b14b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 16 21:17:02.272945 containerd[1678]: time="2026-01-16T21:17:02.272891012Z" level=info msg="Container 86fc30460518b08d49cc10b9f3e6ddda7ec0f0eb92abb18e39171b5f5abf600f: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:17:02.290737 containerd[1678]: time="2026-01-16T21:17:02.290696594Z" level=info msg="CreateContainer within sandbox \"052442c181d99ef92444686b6837d1d6e2f6ab35dfe4e72cb822745e5857b14b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"86fc30460518b08d49cc10b9f3e6ddda7ec0f0eb92abb18e39171b5f5abf600f\"" Jan 16 21:17:02.293043 containerd[1678]: time="2026-01-16T21:17:02.293006009Z" level=info msg="StartContainer for \"86fc30460518b08d49cc10b9f3e6ddda7ec0f0eb92abb18e39171b5f5abf600f\"" Jan 16 21:17:02.294257 containerd[1678]: time="2026-01-16T21:17:02.294143895Z" level=info msg="connecting to shim 86fc30460518b08d49cc10b9f3e6ddda7ec0f0eb92abb18e39171b5f5abf600f" address="unix:///run/containerd/s/454973b2ac31fbb6a6b7c3a4ec360a565453073c4f646e663bec5dc899da5ecf" protocol=ttrpc version=3 Jan 16 21:17:02.301546 containerd[1678]: time="2026-01-16T21:17:02.301473989Z" level=info msg="connecting to shim 3777d1e49e5078a3ed4f807d7d317f14c02fad1f46a1a8ba0f0accda6bf305fb" address="unix:///run/containerd/s/d948cc3455b16aa1f3f6482c30f256b8abd7a6ae36bab4b9af1e6ba2486619e6" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:02.315907 systemd[1]: Started cri-containerd-86fc30460518b08d49cc10b9f3e6ddda7ec0f0eb92abb18e39171b5f5abf600f.scope - libcontainer container 86fc30460518b08d49cc10b9f3e6ddda7ec0f0eb92abb18e39171b5f5abf600f. 
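The SYSCALL records in this log identify calls only by number, and `arch=c000003e` marks them as x86_64. The three numbers that recur here map to `bpf` (the `BPF prog-id … op=LOAD` events runc emits while setting up cgroup device filters), `close` (which produces the matching `op=UNLOAD` when the program's fd is closed), and `sendmsg` (the netlink traffic behind the NETFILTER_CFG records). A small reference sketch, assuming records already parsed into key=value fields (the authoritative mapping is the kernel's x86_64 syscall table):

```python
# x86_64 syscall numbers seen in the audit SYSCALL records of this log.
X86_64_SYSCALLS = {
    3: "close",      # closing a BPF prog fd triggers the matching "op=UNLOAD"
    46: "sendmsg",   # netlink messages carrying the nft_register_* operations
    321: "bpf",      # loading the cgroup device-filter programs runc sets up
}

def describe(fields: dict) -> str:
    """Summarize one audit SYSCALL record parsed into key=value fields."""
    nr = int(fields["syscall"])
    name = X86_64_SYSCALLS.get(nr, f"syscall {nr}")
    return f'{fields["comm"]} called {name} -> exit={fields["exit"]}'

print(describe({"syscall": "321", "comm": "runc", "exit": "21"}))
# runc called bpf -> exit=21
```

For a successful `bpf(BPF_PROG_LOAD)` the `exit=` value is the new program fd, which is why the later `close` records carry the same descriptor numbers (`a0=15`, `a0=17`, …) in hex-adjacent form.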
Jan 16 21:17:02.332806 systemd[1]: Started cri-containerd-3777d1e49e5078a3ed4f807d7d317f14c02fad1f46a1a8ba0f0accda6bf305fb.scope - libcontainer container 3777d1e49e5078a3ed4f807d7d317f14c02fad1f46a1a8ba0f0accda6bf305fb. Jan 16 21:17:02.342000 audit: BPF prog-id=138 op=LOAD Jan 16 21:17:02.343000 audit: BPF prog-id=139 op=LOAD Jan 16 21:17:02.343000 audit[3021]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3004 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337373764316534396535303738613365643466383037643764333137 Jan 16 21:17:02.343000 audit: BPF prog-id=139 op=UNLOAD Jan 16 21:17:02.343000 audit[3021]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337373764316534396535303738613365643466383037643764333137 Jan 16 21:17:02.343000 audit: BPF prog-id=140 op=LOAD Jan 16 21:17:02.343000 audit[3021]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3004 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 21:17:02.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337373764316534396535303738613365643466383037643764333137 Jan 16 21:17:02.343000 audit: BPF prog-id=141 op=LOAD Jan 16 21:17:02.343000 audit[3021]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3004 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337373764316534396535303738613365643466383037643764333137 Jan 16 21:17:02.343000 audit: BPF prog-id=141 op=UNLOAD Jan 16 21:17:02.343000 audit[3021]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337373764316534396535303738613365643466383037643764333137 Jan 16 21:17:02.343000 audit: BPF prog-id=140 op=UNLOAD Jan 16 21:17:02.343000 audit[3021]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337373764316534396535303738613365643466383037643764333137 Jan 16 21:17:02.343000 audit: BPF prog-id=142 op=LOAD Jan 16 21:17:02.343000 audit[3021]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3004 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337373764316534396535303738613365643466383037643764333137 Jan 16 21:17:02.359000 audit: BPF prog-id=143 op=LOAD Jan 16 21:17:02.359000 audit[2991]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2951 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836666333303436303531386230386434396363313062396633653664 Jan 16 21:17:02.359000 audit: BPF prog-id=144 op=LOAD Jan 16 21:17:02.359000 audit[2991]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2951 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836666333303436303531386230386434396363313062396633653664 Jan 16 21:17:02.359000 audit: BPF prog-id=144 op=UNLOAD Jan 16 21:17:02.359000 audit[2991]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2951 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836666333303436303531386230386434396363313062396633653664 Jan 16 21:17:02.359000 audit: BPF prog-id=143 op=UNLOAD Jan 16 21:17:02.359000 audit[2991]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2951 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836666333303436303531386230386434396363313062396633653664 Jan 16 21:17:02.359000 audit: BPF prog-id=145 op=LOAD Jan 16 21:17:02.359000 audit[2991]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2951 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836666333303436303531386230386434396363313062396633653664 Jan 16 21:17:02.387842 containerd[1678]: time="2026-01-16T21:17:02.387621133Z" level=info msg="StartContainer for \"86fc30460518b08d49cc10b9f3e6ddda7ec0f0eb92abb18e39171b5f5abf600f\" returns successfully" Jan 16 21:17:02.395212 containerd[1678]: time="2026-01-16T21:17:02.395172082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-r82fq,Uid:36ce1787-3f3c-4c7e-9c41-70a6682f1982,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3777d1e49e5078a3ed4f807d7d317f14c02fad1f46a1a8ba0f0accda6bf305fb\"" Jan 16 21:17:02.396923 containerd[1678]: time="2026-01-16T21:17:02.396898837Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 16 21:17:02.492000 audit[3097]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.492000 audit[3097]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd536d6750 a2=0 a3=7ffd536d673c items=0 ppid=3028 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.492000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 16 21:17:02.493000 audit[3099]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain pid=3099 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.493000 
audit[3099]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc3660f640 a2=0 a3=7ffc3660f62c items=0 ppid=3028 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.493000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 16 21:17:02.493000 audit[3100]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=3100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.493000 audit[3100]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffa18643f0 a2=0 a3=7fffa18643dc items=0 ppid=3028 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.493000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 16 21:17:02.495000 audit[3102]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=3102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.495000 audit[3102]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc602fc7b0 a2=0 a3=7ffc602fc79c items=0 ppid=3028 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.495000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 16 21:17:02.496000 audit[3101]: NETFILTER_CFG table=filter:58 family=2 entries=1 op=nft_register_chain pid=3101 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Jan 16 21:17:02.496000 audit[3101]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff08a0c8c0 a2=0 a3=7fff08a0c8ac items=0 ppid=3028 pid=3101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.496000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 16 21:17:02.500000 audit[3103]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.500000 audit[3103]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffff5e0dc00 a2=0 a3=7ffff5e0dbec items=0 ppid=3028 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.500000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 16 21:17:02.601000 audit[3104]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3104 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.601000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fffd693f080 a2=0 a3=7fffd693f06c items=0 ppid=3028 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.601000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 16 21:17:02.605000 audit[3106]: NETFILTER_CFG table=filter:61 family=2 entries=1 
op=nft_register_rule pid=3106 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.605000 audit[3106]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffffd54bde0 a2=0 a3=7ffffd54bdcc items=0 ppid=3028 pid=3106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.605000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 16 21:17:02.609000 audit[3109]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3109 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.609000 audit[3109]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffce4ff5770 a2=0 a3=7ffce4ff575c items=0 ppid=3028 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.609000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 16 21:17:02.610000 audit[3110]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3110 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.610000 audit[3110]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc6fc5ab20 a2=0 a3=7ffc6fc5ab0c items=0 ppid=3028 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.610000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 16 21:17:02.613000 audit[3112]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3112 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.613000 audit[3112]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffefc6aa020 a2=0 a3=7ffefc6aa00c items=0 ppid=3028 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.613000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 16 21:17:02.614000 audit[3113]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.614000 audit[3113]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd1dac91a0 a2=0 a3=7ffd1dac918c items=0 ppid=3028 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.614000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 16 21:17:02.617000 audit[3115]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.617000 audit[3115]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 
a1=7ffe1a621840 a2=0 a3=7ffe1a62182c items=0 ppid=3028 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.617000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 16 21:17:02.620000 audit[3118]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3118 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.620000 audit[3118]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fffc8393820 a2=0 a3=7fffc839380c items=0 ppid=3028 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.620000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 16 21:17:02.622000 audit[3119]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.622000 audit[3119]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdaa1d7250 a2=0 a3=7ffdaa1d723c items=0 ppid=3028 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.622000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 16 21:17:02.625000 audit[3121]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3121 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.625000 audit[3121]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe457209a0 a2=0 a3=7ffe4572098c items=0 ppid=3028 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.625000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 16 21:17:02.628000 audit[3122]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3122 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.628000 audit[3122]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdb4f73400 a2=0 a3=7ffdb4f733ec items=0 ppid=3028 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.628000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 16 21:17:02.633000 audit[3124]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3124 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.633000 audit[3124]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffcf8aefd80 a2=0 a3=7ffcf8aefd6c items=0 ppid=3028 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.633000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 16 21:17:02.638000 audit[3127]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.638000 audit[3127]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffed671d30 a2=0 a3=7fffed671d1c items=0 ppid=3028 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.638000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 16 21:17:02.643000 audit[3130]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3130 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.643000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe92d94f40 a2=0 a3=7ffe92d94f2c items=0 ppid=3028 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.643000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 16 21:17:02.644000 audit[3131]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3131 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.644000 audit[3131]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffed13cad50 a2=0 a3=7ffed13cad3c items=0 ppid=3028 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.644000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 16 21:17:02.646000 audit[3133]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3133 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.646000 audit[3133]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffd0f594de0 a2=0 a3=7ffd0f594dcc items=0 ppid=3028 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.646000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 21:17:02.650000 audit[3136]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3136 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.650000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffcbff6fc0 a2=0 a3=7fffcbff6fac 
items=0 ppid=3028 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.650000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 21:17:02.651000 audit[3137]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.651000 audit[3137]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff4c6bb5b0 a2=0 a3=7fff4c6bb59c items=0 ppid=3028 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.651000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 16 21:17:02.653000 audit[3139]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3139 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 21:17:02.653000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7fff73977800 a2=0 a3=7fff739777ec items=0 ppid=3028 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.653000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 16 21:17:02.680000 audit[3145]: 
NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:02.680000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd1dbc9740 a2=0 a3=7ffd1dbc972c items=0 ppid=3028 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.680000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:02.693000 audit[3145]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:02.693000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffd1dbc9740 a2=0 a3=7ffd1dbc972c items=0 ppid=3028 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.693000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:02.695000 audit[3150]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3150 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.695000 audit[3150]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fff851fa2a0 a2=0 a3=7fff851fa28c items=0 ppid=3028 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.695000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 16 21:17:02.698000 audit[3152]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3152 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.698000 audit[3152]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7fffe1c3aed0 a2=0 a3=7fffe1c3aebc items=0 ppid=3028 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.698000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 16 21:17:02.701000 audit[3155]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3155 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.701000 audit[3155]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fffbb28f880 a2=0 a3=7fffbb28f86c items=0 ppid=3028 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.701000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 16 21:17:02.702000 audit[3156]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3156 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.702000 audit[3156]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc03961dd0 a2=0 a3=7ffc03961dbc items=0 ppid=3028 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.702000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 16 21:17:02.705000 audit[3158]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3158 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.705000 audit[3158]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcc795aeb0 a2=0 a3=7ffcc795ae9c items=0 ppid=3028 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.705000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 16 21:17:02.707000 audit[3159]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3159 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.707000 audit[3159]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff371f80d0 a2=0 a3=7fff371f80bc items=0 ppid=3028 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.707000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 16 21:17:02.709000 audit[3161]: 
NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3161 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.709000 audit[3161]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe26806c00 a2=0 a3=7ffe26806bec items=0 ppid=3028 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.709000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 16 21:17:02.718000 audit[3164]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3164 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.718000 audit[3164]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffdf6370750 a2=0 a3=7ffdf637073c items=0 ppid=3028 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.718000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 16 21:17:02.719000 audit[3165]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3165 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.719000 audit[3165]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff551ae890 a2=0 a3=7fff551ae87c items=0 ppid=3028 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.719000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 16 21:17:02.723707 kubelet[2882]: I0116 21:17:02.723259 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-9dtk9" podStartSLOduration=1.723242866 podStartE2EDuration="1.723242866s" podCreationTimestamp="2026-01-16 21:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 21:17:02.723045723 +0000 UTC m=+7.157984602" watchObservedRunningTime="2026-01-16 21:17:02.723242866 +0000 UTC m=+7.158181745" Jan 16 21:17:02.727000 audit[3167]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3167 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.727000 audit[3167]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff872f2700 a2=0 a3=7fff872f26ec items=0 ppid=3028 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.727000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 16 21:17:02.729000 audit[3168]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3168 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.729000 audit[3168]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdea8ff060 a2=0 a3=7ffdea8ff04c items=0 ppid=3028 pid=3168 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.729000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 16 21:17:02.733000 audit[3170]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3170 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.733000 audit[3170]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdcb56f670 a2=0 a3=7ffdcb56f65c items=0 ppid=3028 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.733000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 16 21:17:02.739000 audit[3173]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3173 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.739000 audit[3173]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe42ff8320 a2=0 a3=7ffe42ff830c items=0 ppid=3028 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.739000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 
16 21:17:02.743000 audit[3176]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3176 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.743000 audit[3176]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff3f97b220 a2=0 a3=7fff3f97b20c items=0 ppid=3028 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.743000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 16 21:17:02.744000 audit[3177]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3177 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.744000 audit[3177]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe51a283b0 a2=0 a3=7ffe51a2839c items=0 ppid=3028 pid=3177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.744000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 16 21:17:02.747000 audit[3179]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3179 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.747000 audit[3179]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffd7c254450 a2=0 a3=7ffd7c25443c items=0 ppid=3028 pid=3179 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 21:17:02.747000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 21:17:02.750000 audit[3182]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3182 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.750000 audit[3182]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe09d99a20 a2=0 a3=7ffe09d99a0c items=0 ppid=3028 pid=3182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.750000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 21:17:02.751000 audit[3183]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3183 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.751000 audit[3183]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdb7e7b9b0 a2=0 a3=7ffdb7e7b99c items=0 ppid=3028 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.751000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 16 21:17:02.754000 audit[3185]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3185 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.754000 audit[3185]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 
a1=7ffe2fcd2a60 a2=0 a3=7ffe2fcd2a4c items=0 ppid=3028 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.754000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 16 21:17:02.755000 audit[3186]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.755000 audit[3186]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd6af5e360 a2=0 a3=7ffd6af5e34c items=0 ppid=3028 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.755000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 16 21:17:02.758000 audit[3188]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.758000 audit[3188]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffca25af780 a2=0 a3=7ffca25af76c items=0 ppid=3028 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.758000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 21:17:02.761000 audit[3191]: NETFILTER_CFG table=filter:102 family=10 entries=1 
op=nft_register_rule pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 21:17:02.761000 audit[3191]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd8a865780 a2=0 a3=7ffd8a86576c items=0 ppid=3028 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.761000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 21:17:02.766000 audit[3193]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 16 21:17:02.766000 audit[3193]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7fffccdb8510 a2=0 a3=7fffccdb84fc items=0 ppid=3028 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.766000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:02.767000 audit[3193]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 16 21:17:02.767000 audit[3193]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7fffccdb8510 a2=0 a3=7fffccdb84fc items=0 ppid=3028 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:02.767000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 
21:17:03.006216 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount72349063.mount: Deactivated successfully. Jan 16 21:17:04.585703 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1806302273.mount: Deactivated successfully. Jan 16 21:17:05.432434 containerd[1678]: time="2026-01-16T21:17:05.432382304Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:05.434435 containerd[1678]: time="2026-01-16T21:17:05.434414071Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=0" Jan 16 21:17:05.436026 containerd[1678]: time="2026-01-16T21:17:05.435992146Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:05.439344 containerd[1678]: time="2026-01-16T21:17:05.438650457Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:05.439344 containerd[1678]: time="2026-01-16T21:17:05.439141467Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 3.042216615s" Jan 16 21:17:05.439344 containerd[1678]: time="2026-01-16T21:17:05.439160484Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 16 21:17:05.441876 containerd[1678]: time="2026-01-16T21:17:05.441857529Z" level=info msg="CreateContainer within sandbox 
\"3777d1e49e5078a3ed4f807d7d317f14c02fad1f46a1a8ba0f0accda6bf305fb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 16 21:17:05.453994 containerd[1678]: time="2026-01-16T21:17:05.453968982Z" level=info msg="Container 537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:17:05.455562 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount934111742.mount: Deactivated successfully. Jan 16 21:17:05.464805 containerd[1678]: time="2026-01-16T21:17:05.464773907Z" level=info msg="CreateContainer within sandbox \"3777d1e49e5078a3ed4f807d7d317f14c02fad1f46a1a8ba0f0accda6bf305fb\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a\"" Jan 16 21:17:05.465575 containerd[1678]: time="2026-01-16T21:17:05.465551627Z" level=info msg="StartContainer for \"537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a\"" Jan 16 21:17:05.466292 containerd[1678]: time="2026-01-16T21:17:05.466269785Z" level=info msg="connecting to shim 537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a" address="unix:///run/containerd/s/d948cc3455b16aa1f3f6482c30f256b8abd7a6ae36bab4b9af1e6ba2486619e6" protocol=ttrpc version=3 Jan 16 21:17:05.486813 systemd[1]: Started cri-containerd-537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a.scope - libcontainer container 537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a. 
Jan 16 21:17:05.494000 audit: BPF prog-id=146 op=LOAD Jan 16 21:17:05.495000 audit: BPF prog-id=147 op=LOAD Jan 16 21:17:05.495000 audit[3202]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=3004 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:05.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533376632653039356365643936306662323662623266396134396637 Jan 16 21:17:05.495000 audit: BPF prog-id=147 op=UNLOAD Jan 16 21:17:05.495000 audit[3202]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:05.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533376632653039356365643936306662323662623266396134396637 Jan 16 21:17:05.495000 audit: BPF prog-id=148 op=LOAD Jan 16 21:17:05.495000 audit[3202]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3004 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:05.495000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533376632653039356365643936306662323662623266396134396637 Jan 16 21:17:05.495000 audit: BPF prog-id=149 op=LOAD Jan 16 21:17:05.495000 audit[3202]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3004 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:05.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533376632653039356365643936306662323662623266396134396637 Jan 16 21:17:05.496000 audit: BPF prog-id=149 op=UNLOAD Jan 16 21:17:05.496000 audit[3202]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:05.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533376632653039356365643936306662323662623266396134396637 Jan 16 21:17:05.496000 audit: BPF prog-id=148 op=UNLOAD Jan 16 21:17:05.496000 audit[3202]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:05.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533376632653039356365643936306662323662623266396134396637 Jan 16 21:17:05.496000 audit: BPF prog-id=150 op=LOAD Jan 16 21:17:05.496000 audit[3202]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3004 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:05.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533376632653039356365643936306662323662623266396134396637 Jan 16 21:17:05.514096 containerd[1678]: time="2026-01-16T21:17:05.514054608Z" level=info msg="StartContainer for \"537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a\" returns successfully" Jan 16 21:17:05.747528 kubelet[2882]: I0116 21:17:05.747419 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-r82fq" podStartSLOduration=1.7041075220000002 podStartE2EDuration="4.747396123s" podCreationTimestamp="2026-01-16 21:17:01 +0000 UTC" firstStartedPulling="2026-01-16 21:17:02.396445158 +0000 UTC m=+6.831384016" lastFinishedPulling="2026-01-16 21:17:05.439733758 +0000 UTC m=+9.874672617" observedRunningTime="2026-01-16 21:17:05.730265199 +0000 UTC m=+10.165204077" watchObservedRunningTime="2026-01-16 21:17:05.747396123 +0000 UTC m=+10.182334999" Jan 16 21:17:10.971338 sudo[1942]: pam_unix(sudo:session): session closed for user root Jan 16 21:17:10.976662 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 16 
21:17:10.976763 kernel: audit: type=1106 audit(1768598230.970:521): pid=1942 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:17:10.970000 audit[1942]: USER_END pid=1942 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:17:10.970000 audit[1942]: CRED_DISP pid=1942 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:17:10.980610 kernel: audit: type=1104 audit(1768598230.970:522): pid=1942 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 21:17:11.068532 sshd[1941]: Connection closed by 4.153.228.146 port 36076 Jan 16 21:17:11.070482 sshd-session[1937]: pam_unix(sshd:session): session closed for user core Jan 16 21:17:11.071000 audit[1937]: USER_END pid=1937 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:17:11.075147 systemd[1]: sshd@8-10.0.3.156:22-4.153.228.146:36076.service: Deactivated successfully. 
Jan 16 21:17:11.077172 kernel: audit: type=1106 audit(1768598231.071:523): pid=1937 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:17:11.078316 systemd[1]: session-10.scope: Deactivated successfully. Jan 16 21:17:11.078553 systemd[1]: session-10.scope: Consumed 4.015s CPU time, 229.2M memory peak. Jan 16 21:17:11.071000 audit[1937]: CRED_DISP pid=1937 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:17:11.073000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.156:22-4.153.228.146:36076 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:17:11.084849 kernel: audit: type=1104 audit(1768598231.071:524): pid=1937 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:17:11.084900 kernel: audit: type=1131 audit(1768598231.073:525): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.156:22-4.153.228.146:36076 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:17:11.087076 systemd-logind[1647]: Session 10 logged out. Waiting for processes to exit. Jan 16 21:17:11.088566 systemd-logind[1647]: Removed session 10. 
Jan 16 21:17:11.755000 audit[3287]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3287 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:11.763647 kernel: audit: type=1325 audit(1768598231.755:526): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3287 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:11.763744 kernel: audit: type=1300 audit(1768598231.755:526): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcc8d6b440 a2=0 a3=7ffcc8d6b42c items=0 ppid=3028 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:11.755000 audit[3287]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcc8d6b440 a2=0 a3=7ffcc8d6b42c items=0 ppid=3028 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:11.755000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:11.768807 kernel: audit: type=1327 audit(1768598231.755:526): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:11.771000 audit[3287]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3287 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:11.775616 kernel: audit: type=1325 audit(1768598231.771:527): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3287 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:11.771000 audit[3287]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcc8d6b440 a2=0 a3=0 items=0 ppid=3028 pid=3287 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:11.782631 kernel: audit: type=1300 audit(1768598231.771:527): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcc8d6b440 a2=0 a3=0 items=0 ppid=3028 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:11.771000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:11.799000 audit[3289]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:11.799000 audit[3289]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fff5a601b00 a2=0 a3=7fff5a601aec items=0 ppid=3028 pid=3289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:11.799000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:11.803000 audit[3289]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:11.803000 audit[3289]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff5a601b00 a2=0 a3=0 items=0 ppid=3028 pid=3289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:11.803000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:13.951000 audit[3292]: NETFILTER_CFG table=filter:109 family=2 entries=16 op=nft_register_rule pid=3292 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:13.951000 audit[3292]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fff6ae4e0c0 a2=0 a3=7fff6ae4e0ac items=0 ppid=3028 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:13.951000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:13.957000 audit[3292]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3292 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:13.957000 audit[3292]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff6ae4e0c0 a2=0 a3=0 items=0 ppid=3028 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:13.957000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:13.976000 audit[3294]: NETFILTER_CFG table=filter:111 family=2 entries=17 op=nft_register_rule pid=3294 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:13.976000 audit[3294]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7fffaf50eb70 a2=0 a3=7fffaf50eb5c items=0 ppid=3028 pid=3294 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:13.976000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:13.980000 audit[3294]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3294 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:13.980000 audit[3294]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffaf50eb70 a2=0 a3=0 items=0 ppid=3028 pid=3294 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:13.980000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:15.001000 audit[3296]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3296 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:15.001000 audit[3296]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffe6b478bb0 a2=0 a3=7ffe6b478b9c items=0 ppid=3028 pid=3296 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:15.001000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:15.006000 audit[3296]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3296 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:15.006000 audit[3296]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe6b478bb0 a2=0 a3=0 items=0 ppid=3028 pid=3296 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:15.006000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:15.759872 systemd[1]: Created slice kubepods-besteffort-pod92b412ea_8b1b_4770_86ca_57495c6ad920.slice - libcontainer container kubepods-besteffort-pod92b412ea_8b1b_4770_86ca_57495c6ad920.slice. Jan 16 21:17:15.787819 kubelet[2882]: I0116 21:17:15.787782 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92b412ea-8b1b-4770-86ca-57495c6ad920-tigera-ca-bundle\") pod \"calico-typha-5b4d6fbcc9-dvxc5\" (UID: \"92b412ea-8b1b-4770-86ca-57495c6ad920\") " pod="calico-system/calico-typha-5b4d6fbcc9-dvxc5" Jan 16 21:17:15.787819 kubelet[2882]: I0116 21:17:15.787817 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/92b412ea-8b1b-4770-86ca-57495c6ad920-typha-certs\") pod \"calico-typha-5b4d6fbcc9-dvxc5\" (UID: \"92b412ea-8b1b-4770-86ca-57495c6ad920\") " pod="calico-system/calico-typha-5b4d6fbcc9-dvxc5" Jan 16 21:17:15.788147 kubelet[2882]: I0116 21:17:15.787835 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsvc8\" (UniqueName: \"kubernetes.io/projected/92b412ea-8b1b-4770-86ca-57495c6ad920-kube-api-access-vsvc8\") pod \"calico-typha-5b4d6fbcc9-dvxc5\" (UID: \"92b412ea-8b1b-4770-86ca-57495c6ad920\") " pod="calico-system/calico-typha-5b4d6fbcc9-dvxc5" Jan 16 21:17:15.934947 systemd[1]: Created slice kubepods-besteffort-pod47005ba6_4c77_4144_ac17_99d9ea13d379.slice - libcontainer container kubepods-besteffort-pod47005ba6_4c77_4144_ac17_99d9ea13d379.slice. 
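The `PROCTITLE` records above repeat the same hex blob. That field is the process's full argv, hex-encoded with NUL bytes separating the arguments. A short sketch of decoding it (the helper name is ours, not part of any tool):

```python
def decode_proctitle(hex_str: str) -> str:
    """Decode a hex-encoded audit proctitle into a space-joined command line.

    The audit subsystem records argv as raw bytes with NUL separators
    between arguments, then hex-encodes the whole buffer.
    """
    raw = bytes.fromhex(hex_str)
    return " ".join(arg.decode() for arg in raw.split(b"\x00"))

# The value repeated in the NETFILTER_CFG events above:
proctitle = (
    "69707461626C65732D726573746F7265002D770035002D5700313030303030"
    "002D2D6E6F666C757368002D2D636F756E74657273"
)
print(decode_proctitle(proctitle))
# iptables-restore -w 5 -W 100000 --noflush --counters
```

Decoded, the repeated rule registrations come from `iptables-restore -w 5 -W 100000 --noflush --counters`, consistent with kube-proxy periodically resyncing its filter and nat tables via `xtables-nft-multi`.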
Jan 16 21:17:15.989528 kubelet[2882]: I0116 21:17:15.989370 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47005ba6-4c77-4144-ac17-99d9ea13d379-tigera-ca-bundle\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989528 kubelet[2882]: I0116 21:17:15.989419 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/47005ba6-4c77-4144-ac17-99d9ea13d379-node-certs\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989528 kubelet[2882]: I0116 21:17:15.989445 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xn29\" (UniqueName: \"kubernetes.io/projected/47005ba6-4c77-4144-ac17-99d9ea13d379-kube-api-access-2xn29\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989528 kubelet[2882]: I0116 21:17:15.989463 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-var-lib-calico\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989528 kubelet[2882]: I0116 21:17:15.989481 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-var-run-calico\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989825 kubelet[2882]: I0116 
21:17:15.989537 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-cni-bin-dir\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989825 kubelet[2882]: I0116 21:17:15.989572 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-policysync\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989825 kubelet[2882]: I0116 21:17:15.989590 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-cni-net-dir\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989825 kubelet[2882]: I0116 21:17:15.989653 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-flexvol-driver-host\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989825 kubelet[2882]: I0116 21:17:15.989671 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-lib-modules\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989930 kubelet[2882]: I0116 21:17:15.989688 2882 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-xtables-lock\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:15.989930 kubelet[2882]: I0116 21:17:15.989731 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/47005ba6-4c77-4144-ac17-99d9ea13d379-cni-log-dir\") pod \"calico-node-9x4ft\" (UID: \"47005ba6-4c77-4144-ac17-99d9ea13d379\") " pod="calico-system/calico-node-9x4ft" Jan 16 21:17:16.018000 audit[3301]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3301 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:16.020948 kernel: kauditd_printk_skb: 25 callbacks suppressed Jan 16 21:17:16.020997 kernel: audit: type=1325 audit(1768598236.018:536): table=filter:115 family=2 entries=21 op=nft_register_rule pid=3301 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:16.018000 audit[3301]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffd96272840 a2=0 a3=7ffd9627282c items=0 ppid=3028 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.025567 kernel: audit: type=1300 audit(1768598236.018:536): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffd96272840 a2=0 a3=7ffd9627282c items=0 ppid=3028 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.018000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:16.024000 audit[3301]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3301 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:16.033716 kernel: audit: type=1327 audit(1768598236.018:536): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:16.033845 kernel: audit: type=1325 audit(1768598236.024:537): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3301 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:16.024000 audit[3301]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd96272840 a2=0 a3=0 items=0 ppid=3028 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.037407 kernel: audit: type=1300 audit(1768598236.024:537): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd96272840 a2=0 a3=0 items=0 ppid=3028 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.024000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:16.041276 kernel: audit: type=1327 audit(1768598236.024:537): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:16.071318 containerd[1678]: time="2026-01-16T21:17:16.071283465Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b4d6fbcc9-dvxc5,Uid:92b412ea-8b1b-4770-86ca-57495c6ad920,Namespace:calico-system,Attempt:0,}" Jan 
16 21:17:16.093910 kubelet[2882]: E0116 21:17:16.093675 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.093910 kubelet[2882]: W0116 21:17:16.093877 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.093910 kubelet[2882]: E0116 21:17:16.093911 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.096714 kubelet[2882]: E0116 21:17:16.096663 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.096714 kubelet[2882]: W0116 21:17:16.096684 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.097093 kubelet[2882]: E0116 21:17:16.096845 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.098484 kubelet[2882]: E0116 21:17:16.097822 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.098484 kubelet[2882]: W0116 21:17:16.097927 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.098484 kubelet[2882]: E0116 21:17:16.097946 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.100623 kubelet[2882]: E0116 21:17:16.098930 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.100623 kubelet[2882]: W0116 21:17:16.098962 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.100713 kubelet[2882]: E0116 21:17:16.100639 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.101626 kubelet[2882]: E0116 21:17:16.100972 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.101626 kubelet[2882]: W0116 21:17:16.100984 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.101626 kubelet[2882]: E0116 21:17:16.101143 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.102695 kubelet[2882]: E0116 21:17:16.102662 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.102695 kubelet[2882]: W0116 21:17:16.102675 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.102695 kubelet[2882]: E0116 21:17:16.102690 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.106759 kubelet[2882]: E0116 21:17:16.106741 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.110690 kubelet[2882]: W0116 21:17:16.110624 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.110973 kubelet[2882]: E0116 21:17:16.110711 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.110973 kubelet[2882]: E0116 21:17:16.110942 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.110973 kubelet[2882]: W0116 21:17:16.110949 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.111250 kubelet[2882]: E0116 21:17:16.111236 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.111377 kubelet[2882]: W0116 21:17:16.111338 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.111377 kubelet[2882]: E0116 21:17:16.111355 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.111377 kubelet[2882]: E0116 21:17:16.111374 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.117641 kubelet[2882]: E0116 21:17:16.116951 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.117641 kubelet[2882]: W0116 21:17:16.116982 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.117641 kubelet[2882]: E0116 21:17:16.116997 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.118525 containerd[1678]: time="2026-01-16T21:17:16.118492495Z" level=info msg="connecting to shim bb35461fc246234ea5c7fbc259a07fa1a600e6620032c059222216634fcb73f4" address="unix:///run/containerd/s/01a94323ed3428c8a123ae81de6fb6b4f18a8f124e0112a0c2f626085c226d13" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:16.134311 kubelet[2882]: E0116 21:17:16.133375 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:16.162818 systemd[1]: Started cri-containerd-bb35461fc246234ea5c7fbc259a07fa1a600e6620032c059222216634fcb73f4.scope - libcontainer container bb35461fc246234ea5c7fbc259a07fa1a600e6620032c059222216634fcb73f4. 
Jan 16 21:17:16.173000 audit: BPF prog-id=151 op=LOAD Jan 16 21:17:16.174974 kubelet[2882]: E0116 21:17:16.174855 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.174974 kubelet[2882]: W0116 21:17:16.174880 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.174974 kubelet[2882]: E0116 21:17:16.174901 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.175385 kubelet[2882]: E0116 21:17:16.175298 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.175385 kubelet[2882]: W0116 21:17:16.175307 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.175385 kubelet[2882]: E0116 21:17:16.175316 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.175582 kubelet[2882]: E0116 21:17:16.175557 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.175582 kubelet[2882]: W0116 21:17:16.175565 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.175749 kernel: audit: type=1334 audit(1768598236.173:538): prog-id=151 op=LOAD Jan 16 21:17:16.175782 kubelet[2882]: E0116 21:17:16.175617 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.175882 kubelet[2882]: E0116 21:17:16.175875 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.175925 kubelet[2882]: W0116 21:17:16.175919 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.176024 kubelet[2882]: E0116 21:17:16.175954 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.176108 kubelet[2882]: E0116 21:17:16.176102 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.174000 audit: BPF prog-id=152 op=LOAD Jan 16 21:17:16.176410 kubelet[2882]: W0116 21:17:16.176400 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.176451 kubelet[2882]: E0116 21:17:16.176444 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.174000 audit[3335]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=3321 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.177988 kubelet[2882]: E0116 21:17:16.177573 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.177988 kubelet[2882]: W0116 21:17:16.177872 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.177988 kubelet[2882]: E0116 21:17:16.177908 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.178220 kubelet[2882]: E0116 21:17:16.178186 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.178220 kubelet[2882]: W0116 21:17:16.178194 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.178296 kubelet[2882]: E0116 21:17:16.178268 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.178529 kubelet[2882]: E0116 21:17:16.178468 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.178529 kubelet[2882]: W0116 21:17:16.178474 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.178529 kubelet[2882]: E0116 21:17:16.178482 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.179522 kernel: audit: type=1334 audit(1768598236.174:539): prog-id=152 op=LOAD Jan 16 21:17:16.179576 kernel: audit: type=1300 audit(1768598236.174:539): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=3321 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.174000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262333534363166633234363233346561356337666263323539613037 Jan 16 21:17:16.182389 kubelet[2882]: E0116 21:17:16.182221 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.182389 kubelet[2882]: W0116 21:17:16.182233 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.182389 kubelet[2882]: E0116 21:17:16.182244 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.183895 kernel: audit: type=1327 audit(1768598236.174:539): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262333534363166633234363233346561356337666263323539613037 Jan 16 21:17:16.187052 kubelet[2882]: E0116 21:17:16.186962 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.188342 kubelet[2882]: W0116 21:17:16.188132 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.188342 kubelet[2882]: E0116 21:17:16.188154 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.174000 audit: BPF prog-id=152 op=UNLOAD Jan 16 21:17:16.174000 audit[3335]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3321 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.174000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262333534363166633234363233346561356337666263323539613037 Jan 16 21:17:16.175000 audit: BPF prog-id=153 op=LOAD Jan 16 21:17:16.175000 audit[3335]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=3321 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262333534363166633234363233346561356337666263323539613037 Jan 16 21:17:16.175000 audit: BPF prog-id=154 op=LOAD Jan 16 21:17:16.175000 audit[3335]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=3321 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.175000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262333534363166633234363233346561356337666263323539613037 Jan 16 21:17:16.175000 audit: BPF prog-id=154 op=UNLOAD Jan 16 21:17:16.175000 audit[3335]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3321 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262333534363166633234363233346561356337666263323539613037 Jan 16 21:17:16.175000 audit: BPF prog-id=153 op=UNLOAD Jan 16 21:17:16.175000 audit[3335]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3321 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262333534363166633234363233346561356337666263323539613037 Jan 16 21:17:16.175000 audit: BPF prog-id=155 op=LOAD Jan 16 21:17:16.175000 audit[3335]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=3321 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:16.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262333534363166633234363233346561356337666263323539613037 Jan 16 21:17:16.189795 kubelet[2882]: E0116 21:17:16.189024 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.189795 kubelet[2882]: W0116 21:17:16.189034 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.189795 kubelet[2882]: E0116 21:17:16.189045 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.190919 kubelet[2882]: E0116 21:17:16.190195 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.190919 kubelet[2882]: W0116 21:17:16.190204 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.190919 kubelet[2882]: E0116 21:17:16.190215 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.190919 kubelet[2882]: E0116 21:17:16.190710 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.190919 kubelet[2882]: W0116 21:17:16.190722 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.190919 kubelet[2882]: E0116 21:17:16.190745 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.191628 kubelet[2882]: E0116 21:17:16.191355 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.191628 kubelet[2882]: W0116 21:17:16.191364 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.191628 kubelet[2882]: E0116 21:17:16.191374 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.191998 kubelet[2882]: E0116 21:17:16.191948 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.191998 kubelet[2882]: W0116 21:17:16.191958 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.191998 kubelet[2882]: E0116 21:17:16.191967 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.192239 kubelet[2882]: E0116 21:17:16.192193 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.192239 kubelet[2882]: W0116 21:17:16.192200 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.192239 kubelet[2882]: E0116 21:17:16.192207 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.194637 kubelet[2882]: E0116 21:17:16.194385 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.194637 kubelet[2882]: W0116 21:17:16.194398 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.194637 kubelet[2882]: E0116 21:17:16.194408 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.194916 kubelet[2882]: E0116 21:17:16.194904 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.194966 kubelet[2882]: W0116 21:17:16.194953 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.195017 kubelet[2882]: E0116 21:17:16.195007 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.195203 kubelet[2882]: E0116 21:17:16.195196 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.195260 kubelet[2882]: W0116 21:17:16.195237 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.195297 kubelet[2882]: E0116 21:17:16.195290 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.195546 kubelet[2882]: E0116 21:17:16.195481 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.195546 kubelet[2882]: W0116 21:17:16.195492 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.195546 kubelet[2882]: E0116 21:17:16.195499 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.197326 kubelet[2882]: E0116 21:17:16.197083 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.197326 kubelet[2882]: W0116 21:17:16.197094 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.197326 kubelet[2882]: E0116 21:17:16.197104 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.197326 kubelet[2882]: I0116 21:17:16.197126 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a7efe392-df52-4d70-9d66-17372a93751c-registration-dir\") pod \"csi-node-driver-kql5z\" (UID: \"a7efe392-df52-4d70-9d66-17372a93751c\") " pod="calico-system/csi-node-driver-kql5z" Jan 16 21:17:16.200293 kubelet[2882]: E0116 21:17:16.200278 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.200455 kubelet[2882]: W0116 21:17:16.200442 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.200505 kubelet[2882]: E0116 21:17:16.200499 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.200656 kubelet[2882]: I0116 21:17:16.200646 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7efe392-df52-4d70-9d66-17372a93751c-kubelet-dir\") pod \"csi-node-driver-kql5z\" (UID: \"a7efe392-df52-4d70-9d66-17372a93751c\") " pod="calico-system/csi-node-driver-kql5z" Jan 16 21:17:16.202907 kubelet[2882]: E0116 21:17:16.202895 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.203069 kubelet[2882]: W0116 21:17:16.202954 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.203069 kubelet[2882]: E0116 21:17:16.202967 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.203069 kubelet[2882]: I0116 21:17:16.202992 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a7efe392-df52-4d70-9d66-17372a93751c-socket-dir\") pod \"csi-node-driver-kql5z\" (UID: \"a7efe392-df52-4d70-9d66-17372a93751c\") " pod="calico-system/csi-node-driver-kql5z" Jan 16 21:17:16.203698 kubelet[2882]: E0116 21:17:16.203686 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.203857 kubelet[2882]: W0116 21:17:16.203848 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.203902 kubelet[2882]: E0116 21:17:16.203895 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.204008 kubelet[2882]: I0116 21:17:16.203951 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wg29\" (UniqueName: \"kubernetes.io/projected/a7efe392-df52-4d70-9d66-17372a93751c-kube-api-access-8wg29\") pod \"csi-node-driver-kql5z\" (UID: \"a7efe392-df52-4d70-9d66-17372a93751c\") " pod="calico-system/csi-node-driver-kql5z" Jan 16 21:17:16.204879 kubelet[2882]: E0116 21:17:16.204792 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.206000 kubelet[2882]: W0116 21:17:16.204942 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.206000 kubelet[2882]: E0116 21:17:16.204955 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.206000 kubelet[2882]: I0116 21:17:16.204973 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a7efe392-df52-4d70-9d66-17372a93751c-varrun\") pod \"csi-node-driver-kql5z\" (UID: \"a7efe392-df52-4d70-9d66-17372a93751c\") " pod="calico-system/csi-node-driver-kql5z" Jan 16 21:17:16.210065 kubelet[2882]: E0116 21:17:16.210048 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.210065 kubelet[2882]: W0116 21:17:16.210064 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.210155 kubelet[2882]: E0116 21:17:16.210078 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.211741 kubelet[2882]: E0116 21:17:16.211727 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.211741 kubelet[2882]: W0116 21:17:16.211739 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.211823 kubelet[2882]: E0116 21:17:16.211757 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.211903 kubelet[2882]: E0116 21:17:16.211895 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.211953 kubelet[2882]: W0116 21:17:16.211904 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.211953 kubelet[2882]: E0116 21:17:16.211915 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.212029 kubelet[2882]: E0116 21:17:16.212021 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.212051 kubelet[2882]: W0116 21:17:16.212028 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.212126 kubelet[2882]: E0116 21:17:16.212068 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.212171 kubelet[2882]: E0116 21:17:16.212128 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.212171 kubelet[2882]: W0116 21:17:16.212133 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.212171 kubelet[2882]: E0116 21:17:16.212149 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.212252 kubelet[2882]: E0116 21:17:16.212242 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.212252 kubelet[2882]: W0116 21:17:16.212250 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.212319 kubelet[2882]: E0116 21:17:16.212264 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.212491 kubelet[2882]: E0116 21:17:16.212349 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.212491 kubelet[2882]: W0116 21:17:16.212354 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.212491 kubelet[2882]: E0116 21:17:16.212370 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.212491 kubelet[2882]: E0116 21:17:16.212450 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.212491 kubelet[2882]: W0116 21:17:16.212455 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.212491 kubelet[2882]: E0116 21:17:16.212461 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.212659 kubelet[2882]: E0116 21:17:16.212568 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.212659 kubelet[2882]: W0116 21:17:16.212573 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.212659 kubelet[2882]: E0116 21:17:16.212579 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.212795 kubelet[2882]: E0116 21:17:16.212699 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.212795 kubelet[2882]: W0116 21:17:16.212704 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.212795 kubelet[2882]: E0116 21:17:16.212710 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.240245 containerd[1678]: time="2026-01-16T21:17:16.240194180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9x4ft,Uid:47005ba6-4c77-4144-ac17-99d9ea13d379,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:16.260535 containerd[1678]: time="2026-01-16T21:17:16.260430130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b4d6fbcc9-dvxc5,Uid:92b412ea-8b1b-4770-86ca-57495c6ad920,Namespace:calico-system,Attempt:0,} returns sandbox id \"bb35461fc246234ea5c7fbc259a07fa1a600e6620032c059222216634fcb73f4\"" Jan 16 21:17:16.263849 containerd[1678]: time="2026-01-16T21:17:16.263827664Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 16 21:17:16.276942 containerd[1678]: time="2026-01-16T21:17:16.276847008Z" level=info msg="connecting to shim 169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1" address="unix:///run/containerd/s/9d64bde0cf708898bf107573483d6e74fbfe1a121bdd47462cbca4c68e57e7c5" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:16.305891 systemd[1]: Started cri-containerd-169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1.scope - libcontainer container 169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1. Jan 16 21:17:16.307269 kubelet[2882]: E0116 21:17:16.307209 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.307269 kubelet[2882]: W0116 21:17:16.307223 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.307269 kubelet[2882]: E0116 21:17:16.307250 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.307591 kubelet[2882]: E0116 21:17:16.307583 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.307661 kubelet[2882]: W0116 21:17:16.307654 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.307712 kubelet[2882]: E0116 21:17:16.307705 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.307919 kubelet[2882]: E0116 21:17:16.307906 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.307956 kubelet[2882]: W0116 21:17:16.307921 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.308034 kubelet[2882]: E0116 21:17:16.308011 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.308450 kubelet[2882]: E0116 21:17:16.308440 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.308482 kubelet[2882]: W0116 21:17:16.308450 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.308482 kubelet[2882]: E0116 21:17:16.308465 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.308743 kubelet[2882]: E0116 21:17:16.308734 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.308777 kubelet[2882]: W0116 21:17:16.308743 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.308777 kubelet[2882]: E0116 21:17:16.308755 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.309679 kubelet[2882]: E0116 21:17:16.309665 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.309679 kubelet[2882]: W0116 21:17:16.309677 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.309783 kubelet[2882]: E0116 21:17:16.309704 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:16.309835 kubelet[2882]: E0116 21:17:16.309827 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.309835 kubelet[2882]: W0116 21:17:16.309834 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.309909 kubelet[2882]: E0116 21:17:16.309868 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.309968 kubelet[2882]: E0116 21:17:16.309959 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.309968 kubelet[2882]: W0116 21:17:16.309967 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.310032 kubelet[2882]: E0116 21:17:16.309975 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:16.319000 audit: BPF prog-id=156 op=LOAD Jan 16 21:17:16.320000 audit: BPF prog-id=157 op=LOAD Jan 16 21:17:16.320000 audit[3426]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3415 pid=3426 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:16.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136396665313462383033656433306364346239623432316365616235 Jan 16 21:17:16.323339 kubelet[2882]: E0116 21:17:16.323314 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:16.323460 kubelet[2882]: W0116 21:17:16.323412 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:16.323460 kubelet[2882]: E0116 21:17:16.323428 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 16 21:17:16.338586 containerd[1678]: time="2026-01-16T21:17:16.338555600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9x4ft,Uid:47005ba6-4c77-4144-ac17-99d9ea13d379,Namespace:calico-system,Attempt:0,} returns sandbox id \"169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1\""
Jan 16 21:17:17.647870 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1894849715.mount: Deactivated successfully.
Jan 16 21:17:17.669802 kubelet[2882]: E0116 21:17:17.669762 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c"
Jan 16 21:17:18.164877 containerd[1678]: time="2026-01-16T21:17:18.164824589Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 16 21:17:18.167045 containerd[1678]: time="2026-01-16T21:17:18.167003401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=0"
Jan 16 21:17:18.168858 containerd[1678]: time="2026-01-16T21:17:18.168816696Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 16 21:17:18.172902 containerd[1678]: time="2026-01-16T21:17:18.172869016Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 16 21:17:18.173379 containerd[1678]: time="2026-01-16T21:17:18.173145760Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 1.909186903s"
Jan 16 21:17:18.173379 containerd[1678]: time="2026-01-16T21:17:18.173167228Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\""
Jan 16 21:17:18.174965 containerd[1678]: time="2026-01-16T21:17:18.174817128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Jan 16 21:17:18.183617 containerd[1678]: time="2026-01-16T21:17:18.183289598Z" level=info msg="CreateContainer within sandbox \"bb35461fc246234ea5c7fbc259a07fa1a600e6620032c059222216634fcb73f4\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jan 16 21:17:18.193772 containerd[1678]: time="2026-01-16T21:17:18.193745528Z" level=info msg="Container 442cf536fe303ffd40c7f486792db127317f5f3f5dc6ac7dd98ed65c470c49e5: CDI devices from CRI Config.CDIDevices: []"
Jan 16 21:17:18.207574 containerd[1678]: time="2026-01-16T21:17:18.207546521Z" level=info msg="CreateContainer within sandbox \"bb35461fc246234ea5c7fbc259a07fa1a600e6620032c059222216634fcb73f4\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"442cf536fe303ffd40c7f486792db127317f5f3f5dc6ac7dd98ed65c470c49e5\""
Jan 16 21:17:18.208449 containerd[1678]: time="2026-01-16T21:17:18.208429359Z" level=info msg="StartContainer for \"442cf536fe303ffd40c7f486792db127317f5f3f5dc6ac7dd98ed65c470c49e5\""
Jan 16 21:17:18.210214 containerd[1678]: time="2026-01-16T21:17:18.210187592Z" level=info msg="connecting to shim 442cf536fe303ffd40c7f486792db127317f5f3f5dc6ac7dd98ed65c470c49e5" address="unix:///run/containerd/s/01a94323ed3428c8a123ae81de6fb6b4f18a8f124e0112a0c2f626085c226d13" protocol=ttrpc version=3 Jan 16 
21:17:18.231806 systemd[1]: Started cri-containerd-442cf536fe303ffd40c7f486792db127317f5f3f5dc6ac7dd98ed65c470c49e5.scope - libcontainer container 442cf536fe303ffd40c7f486792db127317f5f3f5dc6ac7dd98ed65c470c49e5. Jan 16 21:17:18.243000 audit: BPF prog-id=161 op=LOAD Jan 16 21:17:18.243000 audit: BPF prog-id=162 op=LOAD Jan 16 21:17:18.243000 audit[3489]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3321 pid=3489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:18.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434326366353336666533303366666434306337663438363739326462 Jan 16 21:17:18.292555 containerd[1678]: time="2026-01-16T21:17:18.292523294Z" level=info msg="StartContainer for \"442cf536fe303ffd40c7f486792db127317f5f3f5dc6ac7dd98ed65c470c49e5\" returns successfully" Jan 16 21:17:18.757463 kubelet[2882]: I0116 21:17:18.757396 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5b4d6fbcc9-dvxc5" podStartSLOduration=1.845666547 podStartE2EDuration="3.757381782s" podCreationTimestamp="2026-01-16 21:17:15 +0000 UTC" firstStartedPulling="2026-01-16 21:17:16.262351273 +0000 UTC m=+20.697290130" lastFinishedPulling="2026-01-16 21:17:18.174066508 +0000 UTC m=+22.609005365" observedRunningTime="2026-01-16 21:17:18.757298931 +0000 UTC m=+23.192237790" watchObservedRunningTime="2026-01-16 21:17:18.757381782 +0000 UTC m=+23.192320662" Jan 16 21:17:18.810139 kubelet[2882]: E0116 21:17:18.810098 2882 driver-call.go:262] Failed to unmarshal output for command: init, 
output: "", error: unexpected end of JSON input Jan 16 21:17:18.810139 kubelet[2882]: W0116 21:17:18.810120 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.810139 kubelet[2882]: E0116 21:17:18.810141 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Error: unexpected end of JSON input" Jan 16 21:17:18.825367 kubelet[2882]: E0116 21:17:18.825300 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.825367 kubelet[2882]: W0116 21:17:18.825325 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.825367 kubelet[2882]: E0116 21:17:18.825343 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.825755 kubelet[2882]: E0116 21:17:18.825746 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.825895 kubelet[2882]: W0116 21:17:18.825790 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.825895 kubelet[2882]: E0116 21:17:18.825805 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:18.826180 kubelet[2882]: E0116 21:17:18.826062 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.826180 kubelet[2882]: W0116 21:17:18.826070 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.826180 kubelet[2882]: E0116 21:17:18.826079 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.826301 kubelet[2882]: E0116 21:17:18.826295 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.826344 kubelet[2882]: W0116 21:17:18.826338 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.826391 kubelet[2882]: E0116 21:17:18.826384 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:18.826584 kubelet[2882]: E0116 21:17:18.826567 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.826637 kubelet[2882]: W0116 21:17:18.826584 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.826637 kubelet[2882]: E0116 21:17:18.826614 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.826751 kubelet[2882]: E0116 21:17:18.826742 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.826775 kubelet[2882]: W0116 21:17:18.826750 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.826775 kubelet[2882]: E0116 21:17:18.826761 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:18.826873 kubelet[2882]: E0116 21:17:18.826866 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.826912 kubelet[2882]: W0116 21:17:18.826875 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.826912 kubelet[2882]: E0116 21:17:18.826885 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.827021 kubelet[2882]: E0116 21:17:18.827013 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.827021 kubelet[2882]: W0116 21:17:18.827020 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.827300 kubelet[2882]: E0116 21:17:18.827063 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:18.827300 kubelet[2882]: E0116 21:17:18.827146 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.827300 kubelet[2882]: W0116 21:17:18.827152 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.827300 kubelet[2882]: E0116 21:17:18.827249 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.827300 kubelet[2882]: W0116 21:17:18.827254 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.827300 kubelet[2882]: E0116 21:17:18.827260 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.827413 kubelet[2882]: E0116 21:17:18.827353 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.827413 kubelet[2882]: W0116 21:17:18.827358 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.827413 kubelet[2882]: E0116 21:17:18.827363 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:18.827622 kubelet[2882]: E0116 21:17:18.827472 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.827622 kubelet[2882]: W0116 21:17:18.827481 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.827622 kubelet[2882]: E0116 21:17:18.827486 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.827697 kubelet[2882]: E0116 21:17:18.827686 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.827872 kubelet[2882]: E0116 21:17:18.827864 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.827914 kubelet[2882]: W0116 21:17:18.827908 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.827961 kubelet[2882]: E0116 21:17:18.827954 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:18.828132 kubelet[2882]: E0116 21:17:18.828126 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.828236 kubelet[2882]: W0116 21:17:18.828180 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.828236 kubelet[2882]: E0116 21:17:18.828193 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.828551 kubelet[2882]: E0116 21:17:18.828405 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.828631 kubelet[2882]: W0116 21:17:18.828622 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.828720 kubelet[2882]: E0116 21:17:18.828663 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:18.828861 kubelet[2882]: E0116 21:17:18.828855 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.828896 kubelet[2882]: W0116 21:17:18.828891 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.828939 kubelet[2882]: E0116 21:17:18.828933 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:18.829239 kubelet[2882]: E0116 21:17:18.829103 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.829239 kubelet[2882]: W0116 21:17:18.829110 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.829239 kubelet[2882]: E0116 21:17:18.829116 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 21:17:18.829419 kubelet[2882]: E0116 21:17:18.829414 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 21:17:18.829548 kubelet[2882]: W0116 21:17:18.829540 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 21:17:18.829649 kubelet[2882]: E0116 21:17:18.829641 2882 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 21:17:19.484952 containerd[1678]: time="2026-01-16T21:17:19.484794744Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:19.486406 containerd[1678]: time="2026-01-16T21:17:19.486377189Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:19.488673 containerd[1678]: time="2026-01-16T21:17:19.488589719Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:19.491302 containerd[1678]: time="2026-01-16T21:17:19.491255389Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:19.491986 containerd[1678]: time="2026-01-16T21:17:19.491853793Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.317010533s" Jan 16 21:17:19.491986 containerd[1678]: time="2026-01-16T21:17:19.491879928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 16 21:17:19.495412 containerd[1678]: time="2026-01-16T21:17:19.495123162Z" level=info msg="CreateContainer within sandbox \"169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 16 21:17:19.508876 containerd[1678]: time="2026-01-16T21:17:19.505808809Z" level=info msg="Container a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:17:19.523455 containerd[1678]: time="2026-01-16T21:17:19.523400788Z" level=info msg="CreateContainer within sandbox \"169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b\"" Jan 16 21:17:19.524945 containerd[1678]: time="2026-01-16T21:17:19.524801138Z" level=info msg="StartContainer for \"a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b\"" Jan 16 21:17:19.526721 containerd[1678]: time="2026-01-16T21:17:19.526685259Z" level=info msg="connecting to shim a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b" address="unix:///run/containerd/s/9d64bde0cf708898bf107573483d6e74fbfe1a121bdd47462cbca4c68e57e7c5" protocol=ttrpc version=3 Jan 16 21:17:19.548818 systemd[1]: Started cri-containerd-a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b.scope - libcontainer container 
a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b. Jan 16 21:17:19.597000 audit: BPF prog-id=166 op=LOAD Jan 16 21:17:19.597000 audit[3561]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3415 pid=3561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:19.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139396532396536363261376266313333366330393466303732366462 Jan 16 21:17:19.597000 audit: BPF prog-id=167 op=LOAD Jan 16 21:17:19.597000 audit[3561]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3415 pid=3561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:19.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139396532396536363261376266313333366330393466303732366462 Jan 16 21:17:19.597000 audit: BPF prog-id=167 op=UNLOAD Jan 16 21:17:19.597000 audit[3561]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3415 pid=3561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:19.597000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139396532396536363261376266313333366330393466303732366462 Jan 16 21:17:19.597000 audit: BPF prog-id=166 op=UNLOAD Jan 16 21:17:19.597000 audit[3561]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3415 pid=3561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:19.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139396532396536363261376266313333366330393466303732366462 Jan 16 21:17:19.597000 audit: BPF prog-id=168 op=LOAD Jan 16 21:17:19.597000 audit[3561]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3415 pid=3561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:19.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139396532396536363261376266313333366330393466303732366462 Jan 16 21:17:19.617751 containerd[1678]: time="2026-01-16T21:17:19.617705222Z" level=info msg="StartContainer for \"a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b\" returns successfully" Jan 16 21:17:19.625272 systemd[1]: cri-containerd-a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b.scope: Deactivated successfully. 
Jan 16 21:17:19.628000 audit: BPF prog-id=168 op=UNLOAD Jan 16 21:17:19.629621 containerd[1678]: time="2026-01-16T21:17:19.629528397Z" level=info msg="received container exit event container_id:\"a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b\" id:\"a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b\" pid:3575 exited_at:{seconds:1768598239 nanos:628505986}" Jan 16 21:17:19.648879 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a99e29e662a7bf1336c094f0726db42dd113ea2dac8c47b39cbb4a7fa20cf88b-rootfs.mount: Deactivated successfully. Jan 16 21:17:19.672493 kubelet[2882]: E0116 21:17:19.672460 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:19.766405 kubelet[2882]: I0116 21:17:19.749185 2882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 16 21:17:21.670154 kubelet[2882]: E0116 21:17:21.669846 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:21.755253 containerd[1678]: time="2026-01-16T21:17:21.755215060Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 16 21:17:23.675257 kubelet[2882]: E0116 21:17:23.674668 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kql5z" 
podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:24.376172 containerd[1678]: time="2026-01-16T21:17:24.376131725Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:24.377548 containerd[1678]: time="2026-01-16T21:17:24.377524033Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 16 21:17:24.379032 containerd[1678]: time="2026-01-16T21:17:24.378991708Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:24.381501 containerd[1678]: time="2026-01-16T21:17:24.381453263Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:24.382181 containerd[1678]: time="2026-01-16T21:17:24.382162089Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 2.626906712s" Jan 16 21:17:24.382314 containerd[1678]: time="2026-01-16T21:17:24.382248645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 16 21:17:24.384339 containerd[1678]: time="2026-01-16T21:17:24.384019594Z" level=info msg="CreateContainer within sandbox \"169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 16 21:17:24.397715 containerd[1678]: 
time="2026-01-16T21:17:24.397677487Z" level=info msg="Container 1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:17:24.402237 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2402093006.mount: Deactivated successfully. Jan 16 21:17:24.410415 containerd[1678]: time="2026-01-16T21:17:24.410375290Z" level=info msg="CreateContainer within sandbox \"169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82\"" Jan 16 21:17:24.412627 containerd[1678]: time="2026-01-16T21:17:24.411981306Z" level=info msg="StartContainer for \"1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82\"" Jan 16 21:17:24.413878 containerd[1678]: time="2026-01-16T21:17:24.413854245Z" level=info msg="connecting to shim 1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82" address="unix:///run/containerd/s/9d64bde0cf708898bf107573483d6e74fbfe1a121bdd47462cbca4c68e57e7c5" protocol=ttrpc version=3 Jan 16 21:17:24.437808 systemd[1]: Started cri-containerd-1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82.scope - libcontainer container 1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82. 
Jan 16 21:17:24.492631 kernel: kauditd_printk_skb: 78 callbacks suppressed Jan 16 21:17:24.492921 kernel: audit: type=1334 audit(1768598244.489:568): prog-id=169 op=LOAD Jan 16 21:17:24.492951 kernel: audit: type=1300 audit(1768598244.489:568): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3415 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:24.489000 audit: BPF prog-id=169 op=LOAD Jan 16 21:17:24.489000 audit[3619]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3415 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:24.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165663436386462383064333137346166343831333535333833313036 Jan 16 21:17:24.499077 kernel: audit: type=1327 audit(1768598244.489:568): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165663436386462383064333137346166343831333535333833313036 Jan 16 21:17:24.489000 audit: BPF prog-id=170 op=LOAD Jan 16 21:17:24.502009 kernel: audit: type=1334 audit(1768598244.489:569): prog-id=170 op=LOAD Jan 16 21:17:24.502647 kernel: audit: type=1300 audit(1768598244.489:569): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3415 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:24.489000 audit[3619]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3415 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:24.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165663436386462383064333137346166343831333535333833313036 Jan 16 21:17:24.509017 kernel: audit: type=1327 audit(1768598244.489:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165663436386462383064333137346166343831333535333833313036 Jan 16 21:17:24.489000 audit: BPF prog-id=170 op=UNLOAD Jan 16 21:17:24.512094 kernel: audit: type=1334 audit(1768598244.489:570): prog-id=170 op=UNLOAD Jan 16 21:17:24.489000 audit[3619]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3415 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:24.515336 kernel: audit: type=1300 audit(1768598244.489:570): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3415 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:24.489000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165663436386462383064333137346166343831333535333833313036 Jan 16 21:17:24.519505 kernel: audit: type=1327 audit(1768598244.489:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165663436386462383064333137346166343831333535333833313036 Jan 16 21:17:24.489000 audit: BPF prog-id=169 op=UNLOAD Jan 16 21:17:24.523019 kernel: audit: type=1334 audit(1768598244.489:571): prog-id=169 op=UNLOAD Jan 16 21:17:24.489000 audit[3619]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3415 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:24.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165663436386462383064333137346166343831333535333833313036 Jan 16 21:17:24.489000 audit: BPF prog-id=171 op=LOAD Jan 16 21:17:24.489000 audit[3619]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3415 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:24.489000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165663436386462383064333137346166343831333535333833313036 Jan 16 21:17:24.531827 containerd[1678]: time="2026-01-16T21:17:24.531777448Z" level=info msg="StartContainer for \"1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82\" returns successfully" Jan 16 21:17:25.670251 kubelet[2882]: E0116 21:17:25.669953 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:25.845388 systemd[1]: cri-containerd-1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82.scope: Deactivated successfully. Jan 16 21:17:25.846058 systemd[1]: cri-containerd-1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82.scope: Consumed 476ms CPU time, 201.4M memory peak, 171.3M written to disk. Jan 16 21:17:25.847391 containerd[1678]: time="2026-01-16T21:17:25.847366089Z" level=info msg="received container exit event container_id:\"1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82\" id:\"1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82\" pid:3633 exited_at:{seconds:1768598245 nanos:847172840}" Jan 16 21:17:25.848000 audit: BPF prog-id=171 op=UNLOAD Jan 16 21:17:25.869271 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1ef468db80d3174af48135538310668fce322dc79ef77d21f70c7263c493fd82-rootfs.mount: Deactivated successfully. 
Jan 16 21:17:25.894329 kubelet[2882]: I0116 21:17:25.893797 2882 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 16 21:17:26.577798 kubelet[2882]: W0116 21:17:25.949229 2882 reflector.go:569] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-4580-0-0-p-be73a47b79" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4580-0-0-p-be73a47b79' and this object Jan 16 21:17:26.577798 kubelet[2882]: E0116 21:17:25.949267 2882 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ci-4580-0-0-p-be73a47b79\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4580-0-0-p-be73a47b79' and this object" logger="UnhandledError" Jan 16 21:17:26.577798 kubelet[2882]: I0116 21:17:25.972620 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/eb6b7565-0bac-4e11-9918-28c46c9c8c58-calico-apiserver-certs\") pod \"calico-apiserver-6787d9fcf6-mcbl2\" (UID: \"eb6b7565-0bac-4e11-9918-28c46c9c8c58\") " pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" Jan 16 21:17:26.577798 kubelet[2882]: I0116 21:17:25.972651 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btq9w\" (UniqueName: \"kubernetes.io/projected/91564960-b99d-4a80-9373-1c0eb8e68a7f-kube-api-access-btq9w\") pod \"calico-kube-controllers-56f896845d-kkjxr\" (UID: \"91564960-b99d-4a80-9373-1c0eb8e68a7f\") " pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" Jan 16 21:17:26.577798 kubelet[2882]: I0116 21:17:25.972668 2882 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbx5j\" (UniqueName: \"kubernetes.io/projected/88cd36cc-6fbb-4a49-8bfe-3b87f533604b-kube-api-access-fbx5j\") pod \"coredns-668d6bf9bc-sflq7\" (UID: \"88cd36cc-6fbb-4a49-8bfe-3b87f533604b\") " pod="kube-system/coredns-668d6bf9bc-sflq7" Jan 16 21:17:25.942252 systemd[1]: Created slice kubepods-besteffort-pod2007218a_042d_47f9_bd6c_94e42c633f48.slice - libcontainer container kubepods-besteffort-pod2007218a_042d_47f9_bd6c_94e42c633f48.slice. Jan 16 21:17:26.578073 kubelet[2882]: I0116 21:17:25.972702 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2007218a-042d-47f9-bd6c-94e42c633f48-whisker-ca-bundle\") pod \"whisker-768c989948-8nxzs\" (UID: \"2007218a-042d-47f9-bd6c-94e42c633f48\") " pod="calico-system/whisker-768c989948-8nxzs" Jan 16 21:17:26.578073 kubelet[2882]: I0116 21:17:25.972718 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1133c715-3a00-415e-9905-d5dd19553c44-config-volume\") pod \"coredns-668d6bf9bc-mg6kr\" (UID: \"1133c715-3a00-415e-9905-d5dd19553c44\") " pod="kube-system/coredns-668d6bf9bc-mg6kr" Jan 16 21:17:26.578073 kubelet[2882]: I0116 21:17:25.972732 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3048fb4b-f634-4014-ae59-9ad3946acc61-goldmane-ca-bundle\") pod \"goldmane-666569f655-ns7rw\" (UID: \"3048fb4b-f634-4014-ae59-9ad3946acc61\") " pod="calico-system/goldmane-666569f655-ns7rw" Jan 16 21:17:26.578073 kubelet[2882]: I0116 21:17:25.972746 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: 
\"kubernetes.io/secret/3048fb4b-f634-4014-ae59-9ad3946acc61-goldmane-key-pair\") pod \"goldmane-666569f655-ns7rw\" (UID: \"3048fb4b-f634-4014-ae59-9ad3946acc61\") " pod="calico-system/goldmane-666569f655-ns7rw" Jan 16 21:17:26.578073 kubelet[2882]: I0116 21:17:25.972769 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdsvg\" (UniqueName: \"kubernetes.io/projected/ce4d12ad-802f-4a94-a69f-0be1648ae09b-kube-api-access-xdsvg\") pod \"calico-apiserver-6787d9fcf6-9lshv\" (UID: \"ce4d12ad-802f-4a94-a69f-0be1648ae09b\") " pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" Jan 16 21:17:25.967226 systemd[1]: Created slice kubepods-burstable-pod1133c715_3a00_415e_9905_d5dd19553c44.slice - libcontainer container kubepods-burstable-pod1133c715_3a00_415e_9905_d5dd19553c44.slice. Jan 16 21:17:26.578367 kubelet[2882]: I0116 21:17:25.972787 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91564960-b99d-4a80-9373-1c0eb8e68a7f-tigera-ca-bundle\") pod \"calico-kube-controllers-56f896845d-kkjxr\" (UID: \"91564960-b99d-4a80-9373-1c0eb8e68a7f\") " pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" Jan 16 21:17:26.578367 kubelet[2882]: I0116 21:17:25.972804 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72sgl\" (UniqueName: \"kubernetes.io/projected/eb6b7565-0bac-4e11-9918-28c46c9c8c58-kube-api-access-72sgl\") pod \"calico-apiserver-6787d9fcf6-mcbl2\" (UID: \"eb6b7565-0bac-4e11-9918-28c46c9c8c58\") " pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" Jan 16 21:17:26.578367 kubelet[2882]: I0116 21:17:25.972821 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5vfb\" (UniqueName: 
\"kubernetes.io/projected/1133c715-3a00-415e-9905-d5dd19553c44-kube-api-access-b5vfb\") pod \"coredns-668d6bf9bc-mg6kr\" (UID: \"1133c715-3a00-415e-9905-d5dd19553c44\") " pod="kube-system/coredns-668d6bf9bc-mg6kr" Jan 16 21:17:26.578367 kubelet[2882]: I0116 21:17:25.972843 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3048fb4b-f634-4014-ae59-9ad3946acc61-config\") pod \"goldmane-666569f655-ns7rw\" (UID: \"3048fb4b-f634-4014-ae59-9ad3946acc61\") " pod="calico-system/goldmane-666569f655-ns7rw" Jan 16 21:17:26.578367 kubelet[2882]: I0116 21:17:25.972857 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtjv\" (UniqueName: \"kubernetes.io/projected/3048fb4b-f634-4014-ae59-9ad3946acc61-kube-api-access-twtjv\") pod \"goldmane-666569f655-ns7rw\" (UID: \"3048fb4b-f634-4014-ae59-9ad3946acc61\") " pod="calico-system/goldmane-666569f655-ns7rw" Jan 16 21:17:25.977796 systemd[1]: Created slice kubepods-besteffort-pod91564960_b99d_4a80_9373_1c0eb8e68a7f.slice - libcontainer container kubepods-besteffort-pod91564960_b99d_4a80_9373_1c0eb8e68a7f.slice. 
Jan 16 21:17:26.578526 kubelet[2882]: I0116 21:17:25.972872 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ce4d12ad-802f-4a94-a69f-0be1648ae09b-calico-apiserver-certs\") pod \"calico-apiserver-6787d9fcf6-9lshv\" (UID: \"ce4d12ad-802f-4a94-a69f-0be1648ae09b\") " pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" Jan 16 21:17:26.578526 kubelet[2882]: I0116 21:17:25.972891 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88cd36cc-6fbb-4a49-8bfe-3b87f533604b-config-volume\") pod \"coredns-668d6bf9bc-sflq7\" (UID: \"88cd36cc-6fbb-4a49-8bfe-3b87f533604b\") " pod="kube-system/coredns-668d6bf9bc-sflq7" Jan 16 21:17:26.578526 kubelet[2882]: I0116 21:17:25.972915 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsctr\" (UniqueName: \"kubernetes.io/projected/2007218a-042d-47f9-bd6c-94e42c633f48-kube-api-access-nsctr\") pod \"whisker-768c989948-8nxzs\" (UID: \"2007218a-042d-47f9-bd6c-94e42c633f48\") " pod="calico-system/whisker-768c989948-8nxzs" Jan 16 21:17:26.578526 kubelet[2882]: I0116 21:17:25.972931 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2007218a-042d-47f9-bd6c-94e42c633f48-whisker-backend-key-pair\") pod \"whisker-768c989948-8nxzs\" (UID: \"2007218a-042d-47f9-bd6c-94e42c633f48\") " pod="calico-system/whisker-768c989948-8nxzs" Jan 16 21:17:25.986494 systemd[1]: Created slice kubepods-besteffort-pod3048fb4b_f634_4014_ae59_9ad3946acc61.slice - libcontainer container kubepods-besteffort-pod3048fb4b_f634_4014_ae59_9ad3946acc61.slice. 
Jan 16 21:17:25.992172 systemd[1]: Created slice kubepods-besteffort-podeb6b7565_0bac_4e11_9918_28c46c9c8c58.slice - libcontainer container kubepods-besteffort-podeb6b7565_0bac_4e11_9918_28c46c9c8c58.slice. Jan 16 21:17:25.999127 systemd[1]: Created slice kubepods-burstable-pod88cd36cc_6fbb_4a49_8bfe_3b87f533604b.slice - libcontainer container kubepods-burstable-pod88cd36cc_6fbb_4a49_8bfe_3b87f533604b.slice. Jan 16 21:17:26.005468 systemd[1]: Created slice kubepods-besteffort-podce4d12ad_802f_4a94_a69f_0be1648ae09b.slice - libcontainer container kubepods-besteffort-podce4d12ad_802f_4a94_a69f_0be1648ae09b.slice. Jan 16 21:17:26.894523 containerd[1678]: time="2026-01-16T21:17:26.894473729Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-768c989948-8nxzs,Uid:2007218a-042d-47f9-bd6c-94e42c633f48,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:26.898718 containerd[1678]: time="2026-01-16T21:17:26.898689257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56f896845d-kkjxr,Uid:91564960-b99d-4a80-9373-1c0eb8e68a7f,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:26.898852 containerd[1678]: time="2026-01-16T21:17:26.898774500Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-ns7rw,Uid:3048fb4b-f634-4014-ae59-9ad3946acc61,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:26.898945 containerd[1678]: time="2026-01-16T21:17:26.898881730Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787d9fcf6-mcbl2,Uid:eb6b7565-0bac-4e11-9918-28c46c9c8c58,Namespace:calico-apiserver,Attempt:0,}" Jan 16 21:17:26.900335 containerd[1678]: time="2026-01-16T21:17:26.900313812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787d9fcf6-9lshv,Uid:ce4d12ad-802f-4a94-a69f-0be1648ae09b,Namespace:calico-apiserver,Attempt:0,}" Jan 16 21:17:27.075165 kubelet[2882]: E0116 21:17:27.074325 2882 configmap.go:193] Couldn't get configMap kube-system/coredns: failed 
to sync configmap cache: timed out waiting for the condition Jan 16 21:17:27.075165 kubelet[2882]: E0116 21:17:27.074438 2882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1133c715-3a00-415e-9905-d5dd19553c44-config-volume podName:1133c715-3a00-415e-9905-d5dd19553c44 nodeName:}" failed. No retries permitted until 2026-01-16 21:17:27.574398976 +0000 UTC m=+32.009337848 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/1133c715-3a00-415e-9905-d5dd19553c44-config-volume") pod "coredns-668d6bf9bc-mg6kr" (UID: "1133c715-3a00-415e-9905-d5dd19553c44") : failed to sync configmap cache: timed out waiting for the condition Jan 16 21:17:27.075549 kubelet[2882]: E0116 21:17:27.075188 2882 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Jan 16 21:17:27.075549 kubelet[2882]: E0116 21:17:27.075249 2882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88cd36cc-6fbb-4a49-8bfe-3b87f533604b-config-volume podName:88cd36cc-6fbb-4a49-8bfe-3b87f533604b nodeName:}" failed. No retries permitted until 2026-01-16 21:17:27.57523203 +0000 UTC m=+32.010170888 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/88cd36cc-6fbb-4a49-8bfe-3b87f533604b-config-volume") pod "coredns-668d6bf9bc-sflq7" (UID: "88cd36cc-6fbb-4a49-8bfe-3b87f533604b") : failed to sync configmap cache: timed out waiting for the condition Jan 16 21:17:27.480243 containerd[1678]: time="2026-01-16T21:17:27.480060296Z" level=error msg="Failed to destroy network for sandbox \"487fc3202051e69a3665bdf9ba66e92e86cf2bbca37fdd5b88a32f73958709ad\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.486125 containerd[1678]: time="2026-01-16T21:17:27.486081304Z" level=error msg="Failed to destroy network for sandbox \"375af2c8856f324ae36f9b017f5bd64208c08df4cceb04d82ae2b8e0453ed145\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.486903 containerd[1678]: time="2026-01-16T21:17:27.486869557Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787d9fcf6-mcbl2,Uid:eb6b7565-0bac-4e11-9918-28c46c9c8c58,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"487fc3202051e69a3665bdf9ba66e92e86cf2bbca37fdd5b88a32f73958709ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.487142 kubelet[2882]: E0116 21:17:27.487039 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"487fc3202051e69a3665bdf9ba66e92e86cf2bbca37fdd5b88a32f73958709ad\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.487142 kubelet[2882]: E0116 21:17:27.487102 2882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"487fc3202051e69a3665bdf9ba66e92e86cf2bbca37fdd5b88a32f73958709ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" Jan 16 21:17:27.487142 kubelet[2882]: E0116 21:17:27.487121 2882 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"487fc3202051e69a3665bdf9ba66e92e86cf2bbca37fdd5b88a32f73958709ad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" Jan 16 21:17:27.487237 kubelet[2882]: E0116 21:17:27.487153 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6787d9fcf6-mcbl2_calico-apiserver(eb6b7565-0bac-4e11-9918-28c46c9c8c58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6787d9fcf6-mcbl2_calico-apiserver(eb6b7565-0bac-4e11-9918-28c46c9c8c58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"487fc3202051e69a3665bdf9ba66e92e86cf2bbca37fdd5b88a32f73958709ad\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 
21:17:27.491221 containerd[1678]: time="2026-01-16T21:17:27.491176053Z" level=error msg="Failed to destroy network for sandbox \"9bb945fd6aa2c3de3a10c3fad8b137b0edc892df0dd88f0bc5964f125250534e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.492744 containerd[1678]: time="2026-01-16T21:17:27.492646726Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-ns7rw,Uid:3048fb4b-f634-4014-ae59-9ad3946acc61,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"375af2c8856f324ae36f9b017f5bd64208c08df4cceb04d82ae2b8e0453ed145\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.492896 kubelet[2882]: E0116 21:17:27.492871 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"375af2c8856f324ae36f9b017f5bd64208c08df4cceb04d82ae2b8e0453ed145\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.492999 kubelet[2882]: E0116 21:17:27.492974 2882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"375af2c8856f324ae36f9b017f5bd64208c08df4cceb04d82ae2b8e0453ed145\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-ns7rw" Jan 16 21:17:27.493072 kubelet[2882]: E0116 21:17:27.493057 2882 kuberuntime_manager.go:1237] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"375af2c8856f324ae36f9b017f5bd64208c08df4cceb04d82ae2b8e0453ed145\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-ns7rw" Jan 16 21:17:27.493120 kubelet[2882]: E0116 21:17:27.493097 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-ns7rw_calico-system(3048fb4b-f634-4014-ae59-9ad3946acc61)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-ns7rw_calico-system(3048fb4b-f634-4014-ae59-9ad3946acc61)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"375af2c8856f324ae36f9b017f5bd64208c08df4cceb04d82ae2b8e0453ed145\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:17:27.496422 containerd[1678]: time="2026-01-16T21:17:27.496375883Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-768c989948-8nxzs,Uid:2007218a-042d-47f9-bd6c-94e42c633f48,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9bb945fd6aa2c3de3a10c3fad8b137b0edc892df0dd88f0bc5964f125250534e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.496754 kubelet[2882]: E0116 21:17:27.496588 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9bb945fd6aa2c3de3a10c3fad8b137b0edc892df0dd88f0bc5964f125250534e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.497092 kubelet[2882]: E0116 21:17:27.496637 2882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9bb945fd6aa2c3de3a10c3fad8b137b0edc892df0dd88f0bc5964f125250534e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-768c989948-8nxzs" Jan 16 21:17:27.497092 kubelet[2882]: E0116 21:17:27.497071 2882 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9bb945fd6aa2c3de3a10c3fad8b137b0edc892df0dd88f0bc5964f125250534e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-768c989948-8nxzs" Jan 16 21:17:27.497594 kubelet[2882]: E0116 21:17:27.497293 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-768c989948-8nxzs_calico-system(2007218a-042d-47f9-bd6c-94e42c633f48)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-768c989948-8nxzs_calico-system(2007218a-042d-47f9-bd6c-94e42c633f48)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9bb945fd6aa2c3de3a10c3fad8b137b0edc892df0dd88f0bc5964f125250534e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-768c989948-8nxzs" 
podUID="2007218a-042d-47f9-bd6c-94e42c633f48" Jan 16 21:17:27.508023 containerd[1678]: time="2026-01-16T21:17:27.507901750Z" level=error msg="Failed to destroy network for sandbox \"fba38f143ed58f796ae092da83ea647a0c48402979fe0ad02d477780cf2642cf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.508666 containerd[1678]: time="2026-01-16T21:17:27.508642401Z" level=error msg="Failed to destroy network for sandbox \"2c822366b8fd5251100aa18b514eb27f7fedb7e0af0276b1b07549aa92a444bb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.512426 containerd[1678]: time="2026-01-16T21:17:27.512388375Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56f896845d-kkjxr,Uid:91564960-b99d-4a80-9373-1c0eb8e68a7f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c822366b8fd5251100aa18b514eb27f7fedb7e0af0276b1b07549aa92a444bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.512702 kubelet[2882]: E0116 21:17:27.512580 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c822366b8fd5251100aa18b514eb27f7fedb7e0af0276b1b07549aa92a444bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.512702 kubelet[2882]: E0116 21:17:27.512692 2882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"2c822366b8fd5251100aa18b514eb27f7fedb7e0af0276b1b07549aa92a444bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" Jan 16 21:17:27.512786 kubelet[2882]: E0116 21:17:27.512739 2882 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c822366b8fd5251100aa18b514eb27f7fedb7e0af0276b1b07549aa92a444bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" Jan 16 21:17:27.512810 kubelet[2882]: E0116 21:17:27.512777 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-56f896845d-kkjxr_calico-system(91564960-b99d-4a80-9373-1c0eb8e68a7f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-56f896845d-kkjxr_calico-system(91564960-b99d-4a80-9373-1c0eb8e68a7f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c822366b8fd5251100aa18b514eb27f7fedb7e0af0276b1b07549aa92a444bb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:17:27.513979 containerd[1678]: time="2026-01-16T21:17:27.513897259Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787d9fcf6-9lshv,Uid:ce4d12ad-802f-4a94-a69f-0be1648ae09b,Namespace:calico-apiserver,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fba38f143ed58f796ae092da83ea647a0c48402979fe0ad02d477780cf2642cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.514087 kubelet[2882]: E0116 21:17:27.514065 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fba38f143ed58f796ae092da83ea647a0c48402979fe0ad02d477780cf2642cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.514131 kubelet[2882]: E0116 21:17:27.514105 2882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fba38f143ed58f796ae092da83ea647a0c48402979fe0ad02d477780cf2642cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" Jan 16 21:17:27.514131 kubelet[2882]: E0116 21:17:27.514120 2882 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fba38f143ed58f796ae092da83ea647a0c48402979fe0ad02d477780cf2642cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" Jan 16 21:17:27.514175 kubelet[2882]: E0116 21:17:27.514149 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-6787d9fcf6-9lshv_calico-apiserver(ce4d12ad-802f-4a94-a69f-0be1648ae09b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6787d9fcf6-9lshv_calico-apiserver(ce4d12ad-802f-4a94-a69f-0be1648ae09b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fba38f143ed58f796ae092da83ea647a0c48402979fe0ad02d477780cf2642cf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:17:27.674207 systemd[1]: Created slice kubepods-besteffort-poda7efe392_df52_4d70_9d66_17372a93751c.slice - libcontainer container kubepods-besteffort-poda7efe392_df52_4d70_9d66_17372a93751c.slice. Jan 16 21:17:27.676268 containerd[1678]: time="2026-01-16T21:17:27.676079023Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kql5z,Uid:a7efe392-df52-4d70-9d66-17372a93751c,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:27.728921 containerd[1678]: time="2026-01-16T21:17:27.728882775Z" level=error msg="Failed to destroy network for sandbox \"9067128d3ce6e03e0a9a073cdda138eb4a770f0ba412c24feead6d42cb01a7f9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.732648 containerd[1678]: time="2026-01-16T21:17:27.732476794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kql5z,Uid:a7efe392-df52-4d70-9d66-17372a93751c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9067128d3ce6e03e0a9a073cdda138eb4a770f0ba412c24feead6d42cb01a7f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.733204 kubelet[2882]: E0116 21:17:27.732892 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9067128d3ce6e03e0a9a073cdda138eb4a770f0ba412c24feead6d42cb01a7f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.733204 kubelet[2882]: E0116 21:17:27.732941 2882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9067128d3ce6e03e0a9a073cdda138eb4a770f0ba412c24feead6d42cb01a7f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kql5z" Jan 16 21:17:27.733204 kubelet[2882]: E0116 21:17:27.732961 2882 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9067128d3ce6e03e0a9a073cdda138eb4a770f0ba412c24feead6d42cb01a7f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kql5z" Jan 16 21:17:27.733324 kubelet[2882]: E0116 21:17:27.733002 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"9067128d3ce6e03e0a9a073cdda138eb4a770f0ba412c24feead6d42cb01a7f9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:27.771322 containerd[1678]: time="2026-01-16T21:17:27.771269917Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 16 21:17:27.791926 containerd[1678]: time="2026-01-16T21:17:27.791892312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mg6kr,Uid:1133c715-3a00-415e-9905-d5dd19553c44,Namespace:kube-system,Attempt:0,}" Jan 16 21:17:27.802032 containerd[1678]: time="2026-01-16T21:17:27.801994048Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sflq7,Uid:88cd36cc-6fbb-4a49-8bfe-3b87f533604b,Namespace:kube-system,Attempt:0,}" Jan 16 21:17:27.864010 containerd[1678]: time="2026-01-16T21:17:27.863867714Z" level=error msg="Failed to destroy network for sandbox \"8aa369e92b7985faaa5574ec7834dd2c86b479a09a260a7d403390487cde4df8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.865709 containerd[1678]: time="2026-01-16T21:17:27.865648873Z" level=error msg="Failed to destroy network for sandbox \"50650e990c361afa2b26fa25e106861b984f8ed6d18542bee039b238f8b50ee0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.868503 containerd[1678]: time="2026-01-16T21:17:27.868454909Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mg6kr,Uid:1133c715-3a00-415e-9905-d5dd19553c44,Namespace:kube-system,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"50650e990c361afa2b26fa25e106861b984f8ed6d18542bee039b238f8b50ee0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.868967 kubelet[2882]: E0116 21:17:27.868940 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50650e990c361afa2b26fa25e106861b984f8ed6d18542bee039b238f8b50ee0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.869066 kubelet[2882]: E0116 21:17:27.869055 2882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50650e990c361afa2b26fa25e106861b984f8ed6d18542bee039b238f8b50ee0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mg6kr" Jan 16 21:17:27.869129 kubelet[2882]: E0116 21:17:27.869119 2882 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50650e990c361afa2b26fa25e106861b984f8ed6d18542bee039b238f8b50ee0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mg6kr" Jan 16 21:17:27.869200 kubelet[2882]: E0116 21:17:27.869185 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-mg6kr_kube-system(1133c715-3a00-415e-9905-d5dd19553c44)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-mg6kr_kube-system(1133c715-3a00-415e-9905-d5dd19553c44)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"50650e990c361afa2b26fa25e106861b984f8ed6d18542bee039b238f8b50ee0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-mg6kr" podUID="1133c715-3a00-415e-9905-d5dd19553c44" Jan 16 21:17:27.869693 containerd[1678]: time="2026-01-16T21:17:27.869644263Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sflq7,Uid:88cd36cc-6fbb-4a49-8bfe-3b87f533604b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8aa369e92b7985faaa5574ec7834dd2c86b479a09a260a7d403390487cde4df8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.870006 kubelet[2882]: E0116 21:17:27.869989 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8aa369e92b7985faaa5574ec7834dd2c86b479a09a260a7d403390487cde4df8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 21:17:27.870195 kubelet[2882]: E0116 21:17:27.870169 2882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8aa369e92b7985faaa5574ec7834dd2c86b479a09a260a7d403390487cde4df8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-sflq7" Jan 16 21:17:27.870464 kubelet[2882]: E0116 21:17:27.870319 2882 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8aa369e92b7985faaa5574ec7834dd2c86b479a09a260a7d403390487cde4df8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-sflq7" Jan 16 21:17:27.870464 kubelet[2882]: E0116 21:17:27.870356 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-sflq7_kube-system(88cd36cc-6fbb-4a49-8bfe-3b87f533604b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-sflq7_kube-system(88cd36cc-6fbb-4a49-8bfe-3b87f533604b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8aa369e92b7985faaa5574ec7834dd2c86b479a09a260a7d403390487cde4df8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-sflq7" podUID="88cd36cc-6fbb-4a49-8bfe-3b87f533604b" Jan 16 21:17:28.346247 systemd[1]: run-netns-cni\x2d8073a9c8\x2d9b57\x2d44ce\x2d7a7e\x2d11f54b6b5901.mount: Deactivated successfully. Jan 16 21:17:28.346357 systemd[1]: run-netns-cni\x2db05018f6\x2d4543\x2da05b\x2d6003\x2d7ee169c40f1c.mount: Deactivated successfully. Jan 16 21:17:28.346418 systemd[1]: run-netns-cni\x2d2d964de0\x2d0944\x2d259c\x2d9d1c\x2d1d78874cb866.mount: Deactivated successfully. Jan 16 21:17:28.346475 systemd[1]: run-netns-cni\x2df43afca9\x2d4828\x2d7ae2\x2d6d43\x2de9e22b76e6db.mount: Deactivated successfully. 
Jan 16 21:17:28.346532 systemd[1]: run-netns-cni\x2daf19b77b\x2d76ce\x2d8485\x2d3e46\x2dca84ccb93af8.mount: Deactivated successfully. Jan 16 21:17:31.157658 kubelet[2882]: I0116 21:17:31.157505 2882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 16 21:17:31.193000 audit[3908]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3908 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:31.195798 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 16 21:17:31.195906 kernel: audit: type=1325 audit(1768598251.193:574): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3908 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:31.193000 audit[3908]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffefffd5550 a2=0 a3=7ffefffd553c items=0 ppid=3028 pid=3908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:31.204650 kernel: audit: type=1300 audit(1768598251.193:574): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffefffd5550 a2=0 a3=7ffefffd553c items=0 ppid=3028 pid=3908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:31.193000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:31.204000 audit[3908]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3908 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:31.209533 kernel: audit: type=1327 audit(1768598251.193:574): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 
21:17:31.209591 kernel: audit: type=1325 audit(1768598251.204:575): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3908 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:31.204000 audit[3908]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffefffd5550 a2=0 a3=7ffefffd553c items=0 ppid=3028 pid=3908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:31.213385 kernel: audit: type=1300 audit(1768598251.204:575): arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffefffd5550 a2=0 a3=7ffefffd553c items=0 ppid=3028 pid=3908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:31.204000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:31.219763 kernel: audit: type=1327 audit(1768598251.204:575): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:33.120035 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3457036661.mount: Deactivated successfully. 
Jan 16 21:17:33.583990 containerd[1678]: time="2026-01-16T21:17:33.583901629Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:33.590893 containerd[1678]: time="2026-01-16T21:17:33.590832778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 16 21:17:33.594861 containerd[1678]: time="2026-01-16T21:17:33.594807741Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:33.601793 containerd[1678]: time="2026-01-16T21:17:33.601743919Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 21:17:33.602609 containerd[1678]: time="2026-01-16T21:17:33.602552578Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 5.831242374s" Jan 16 21:17:33.602609 containerd[1678]: time="2026-01-16T21:17:33.602582845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 16 21:17:33.615293 containerd[1678]: time="2026-01-16T21:17:33.615198388Z" level=info msg="CreateContainer within sandbox \"169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 16 21:17:33.642428 containerd[1678]: time="2026-01-16T21:17:33.642388493Z" level=info msg="Container 
1774f67ffa0ac384d650c6e0e3273740bf5814cc80644a90b8c5dec2147e8060: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:17:33.653176 containerd[1678]: time="2026-01-16T21:17:33.653121277Z" level=info msg="CreateContainer within sandbox \"169fe14b803ed30cd4b9b421ceab557ce4cce0841b625ea39f026a5bc3a17fa1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1774f67ffa0ac384d650c6e0e3273740bf5814cc80644a90b8c5dec2147e8060\"" Jan 16 21:17:33.653933 containerd[1678]: time="2026-01-16T21:17:33.653916130Z" level=info msg="StartContainer for \"1774f67ffa0ac384d650c6e0e3273740bf5814cc80644a90b8c5dec2147e8060\"" Jan 16 21:17:33.656104 containerd[1678]: time="2026-01-16T21:17:33.656080581Z" level=info msg="connecting to shim 1774f67ffa0ac384d650c6e0e3273740bf5814cc80644a90b8c5dec2147e8060" address="unix:///run/containerd/s/9d64bde0cf708898bf107573483d6e74fbfe1a121bdd47462cbca4c68e57e7c5" protocol=ttrpc version=3 Jan 16 21:17:33.709799 systemd[1]: Started cri-containerd-1774f67ffa0ac384d650c6e0e3273740bf5814cc80644a90b8c5dec2147e8060.scope - libcontainer container 1774f67ffa0ac384d650c6e0e3273740bf5814cc80644a90b8c5dec2147e8060. 
Jan 16 21:17:33.761000 audit: BPF prog-id=172 op=LOAD Jan 16 21:17:33.761000 audit[3913]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3415 pid=3913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:33.766246 kernel: audit: type=1334 audit(1768598253.761:576): prog-id=172 op=LOAD Jan 16 21:17:33.766320 kernel: audit: type=1300 audit(1768598253.761:576): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3415 pid=3913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:33.761000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137373466363766666130616333383464363530633665306533323733 Jan 16 21:17:33.770883 kernel: audit: type=1327 audit(1768598253.761:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137373466363766666130616333383464363530633665306533323733 Jan 16 21:17:33.761000 audit: BPF prog-id=173 op=LOAD Jan 16 21:17:33.774054 kernel: audit: type=1334 audit(1768598253.761:577): prog-id=173 op=LOAD Jan 16 21:17:33.761000 audit[3913]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3415 pid=3913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:33.761000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137373466363766666130616333383464363530633665306533323733 Jan 16 21:17:33.761000 audit: BPF prog-id=173 op=UNLOAD Jan 16 21:17:33.761000 audit[3913]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3415 pid=3913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:33.761000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137373466363766666130616333383464363530633665306533323733 Jan 16 21:17:33.761000 audit: BPF prog-id=172 op=UNLOAD Jan 16 21:17:33.761000 audit[3913]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3415 pid=3913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:33.761000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137373466363766666130616333383464363530633665306533323733 Jan 16 21:17:33.761000 audit: BPF prog-id=174 op=LOAD Jan 16 21:17:33.761000 audit[3913]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3415 pid=3913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:33.761000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137373466363766666130616333383464363530633665306533323733 Jan 16 21:17:33.796057 containerd[1678]: time="2026-01-16T21:17:33.795975784Z" level=info msg="StartContainer for \"1774f67ffa0ac384d650c6e0e3273740bf5814cc80644a90b8c5dec2147e8060\" returns successfully" Jan 16 21:17:34.020256 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 16 21:17:34.020384 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 16 21:17:34.226546 kubelet[2882]: I0116 21:17:34.226508 2882 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2007218a-042d-47f9-bd6c-94e42c633f48-whisker-ca-bundle\") pod \"2007218a-042d-47f9-bd6c-94e42c633f48\" (UID: \"2007218a-042d-47f9-bd6c-94e42c633f48\") " Jan 16 21:17:34.227105 kubelet[2882]: I0116 21:17:34.226555 2882 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsctr\" (UniqueName: \"kubernetes.io/projected/2007218a-042d-47f9-bd6c-94e42c633f48-kube-api-access-nsctr\") pod \"2007218a-042d-47f9-bd6c-94e42c633f48\" (UID: \"2007218a-042d-47f9-bd6c-94e42c633f48\") " Jan 16 21:17:34.227105 kubelet[2882]: I0116 21:17:34.226580 2882 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2007218a-042d-47f9-bd6c-94e42c633f48-whisker-backend-key-pair\") pod \"2007218a-042d-47f9-bd6c-94e42c633f48\" (UID: \"2007218a-042d-47f9-bd6c-94e42c633f48\") " Jan 16 21:17:34.227176 kubelet[2882]: I0116 21:17:34.227121 2882 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2007218a-042d-47f9-bd6c-94e42c633f48-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "2007218a-042d-47f9-bd6c-94e42c633f48" (UID: "2007218a-042d-47f9-bd6c-94e42c633f48"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 16 21:17:34.231969 systemd[1]: var-lib-kubelet-pods-2007218a\x2d042d\x2d47f9\x2dbd6c\x2d94e42c633f48-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dnsctr.mount: Deactivated successfully. Jan 16 21:17:34.233449 kubelet[2882]: I0116 21:17:34.232725 2882 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2007218a-042d-47f9-bd6c-94e42c633f48-kube-api-access-nsctr" (OuterVolumeSpecName: "kube-api-access-nsctr") pod "2007218a-042d-47f9-bd6c-94e42c633f48" (UID: "2007218a-042d-47f9-bd6c-94e42c633f48"). InnerVolumeSpecName "kube-api-access-nsctr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 16 21:17:34.236313 systemd[1]: var-lib-kubelet-pods-2007218a\x2d042d\x2d47f9\x2dbd6c\x2d94e42c633f48-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 16 21:17:34.237311 kubelet[2882]: I0116 21:17:34.237276 2882 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2007218a-042d-47f9-bd6c-94e42c633f48-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "2007218a-042d-47f9-bd6c-94e42c633f48" (UID: "2007218a-042d-47f9-bd6c-94e42c633f48"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 16 21:17:34.327461 kubelet[2882]: I0116 21:17:34.327366 2882 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2007218a-042d-47f9-bd6c-94e42c633f48-whisker-ca-bundle\") on node \"ci-4580-0-0-p-be73a47b79\" DevicePath \"\"" Jan 16 21:17:34.327461 kubelet[2882]: I0116 21:17:34.327396 2882 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsctr\" (UniqueName: \"kubernetes.io/projected/2007218a-042d-47f9-bd6c-94e42c633f48-kube-api-access-nsctr\") on node \"ci-4580-0-0-p-be73a47b79\" DevicePath \"\"" Jan 16 21:17:34.327461 kubelet[2882]: I0116 21:17:34.327406 2882 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2007218a-042d-47f9-bd6c-94e42c633f48-whisker-backend-key-pair\") on node \"ci-4580-0-0-p-be73a47b79\" DevicePath \"\"" Jan 16 21:17:34.797242 systemd[1]: Removed slice kubepods-besteffort-pod2007218a_042d_47f9_bd6c_94e42c633f48.slice - libcontainer container kubepods-besteffort-pod2007218a_042d_47f9_bd6c_94e42c633f48.slice. Jan 16 21:17:34.815541 kubelet[2882]: I0116 21:17:34.815483 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9x4ft" podStartSLOduration=2.552079398 podStartE2EDuration="19.81546826s" podCreationTimestamp="2026-01-16 21:17:15 +0000 UTC" firstStartedPulling="2026-01-16 21:17:16.339785575 +0000 UTC m=+20.774724433" lastFinishedPulling="2026-01-16 21:17:33.603174437 +0000 UTC m=+38.038113295" observedRunningTime="2026-01-16 21:17:34.814758321 +0000 UTC m=+39.249697200" watchObservedRunningTime="2026-01-16 21:17:34.81546826 +0000 UTC m=+39.250407140" Jan 16 21:17:34.879043 systemd[1]: Created slice kubepods-besteffort-pod96381c87_0aa3_4b10_9efd_84c74923efa3.slice - libcontainer container kubepods-besteffort-pod96381c87_0aa3_4b10_9efd_84c74923efa3.slice. 
Jan 16 21:17:34.931456 kubelet[2882]: I0116 21:17:34.931418 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96381c87-0aa3-4b10-9efd-84c74923efa3-whisker-ca-bundle\") pod \"whisker-6b5c8f6bc4-rwplp\" (UID: \"96381c87-0aa3-4b10-9efd-84c74923efa3\") " pod="calico-system/whisker-6b5c8f6bc4-rwplp" Jan 16 21:17:34.931456 kubelet[2882]: I0116 21:17:34.931463 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/96381c87-0aa3-4b10-9efd-84c74923efa3-whisker-backend-key-pair\") pod \"whisker-6b5c8f6bc4-rwplp\" (UID: \"96381c87-0aa3-4b10-9efd-84c74923efa3\") " pod="calico-system/whisker-6b5c8f6bc4-rwplp" Jan 16 21:17:34.931649 kubelet[2882]: I0116 21:17:34.931481 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bntxj\" (UniqueName: \"kubernetes.io/projected/96381c87-0aa3-4b10-9efd-84c74923efa3-kube-api-access-bntxj\") pod \"whisker-6b5c8f6bc4-rwplp\" (UID: \"96381c87-0aa3-4b10-9efd-84c74923efa3\") " pod="calico-system/whisker-6b5c8f6bc4-rwplp" Jan 16 21:17:35.183927 containerd[1678]: time="2026-01-16T21:17:35.183874783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b5c8f6bc4-rwplp,Uid:96381c87-0aa3-4b10-9efd-84c74923efa3,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:35.442665 systemd-networkd[1571]: cali3f679ffa1b7: Link UP Jan 16 21:17:35.444235 systemd-networkd[1571]: cali3f679ffa1b7: Gained carrier Jan 16 21:17:35.465330 containerd[1678]: 2026-01-16 21:17:35.218 [INFO][4001] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 16 21:17:35.465330 containerd[1678]: 2026-01-16 21:17:35.336 [INFO][4001] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0 whisker-6b5c8f6bc4- calico-system 96381c87-0aa3-4b10-9efd-84c74923efa3 867 0 2026-01-16 21:17:34 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6b5c8f6bc4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4580-0-0-p-be73a47b79 whisker-6b5c8f6bc4-rwplp eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3f679ffa1b7 [] [] }} ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Namespace="calico-system" Pod="whisker-6b5c8f6bc4-rwplp" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-" Jan 16 21:17:35.465330 containerd[1678]: 2026-01-16 21:17:35.337 [INFO][4001] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Namespace="calico-system" Pod="whisker-6b5c8f6bc4-rwplp" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" Jan 16 21:17:35.465330 containerd[1678]: 2026-01-16 21:17:35.373 [INFO][4013] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" HandleID="k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Workload="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.373 [INFO][4013] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" HandleID="k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Workload="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-be73a47b79", "pod":"whisker-6b5c8f6bc4-rwplp", "timestamp":"2026-01-16 21:17:35.373365573 +0000 UTC"}, Hostname:"ci-4580-0-0-p-be73a47b79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.373 [INFO][4013] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.373 [INFO][4013] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.374 [INFO][4013] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-be73a47b79' Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.385 [INFO][4013] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.390 [INFO][4013] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.394 [INFO][4013] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.398 [INFO][4013] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.465574 containerd[1678]: 2026-01-16 21:17:35.403 [INFO][4013] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.466017 containerd[1678]: 2026-01-16 21:17:35.403 [INFO][4013] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.61.192/26 
handle="k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.466017 containerd[1678]: 2026-01-16 21:17:35.405 [INFO][4013] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055 Jan 16 21:17:35.466017 containerd[1678]: 2026-01-16 21:17:35.408 [INFO][4013] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.466017 containerd[1678]: 2026-01-16 21:17:35.417 [INFO][4013] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.61.193/26] block=192.168.61.192/26 handle="k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.466017 containerd[1678]: 2026-01-16 21:17:35.417 [INFO][4013] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.193/26] handle="k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:35.466017 containerd[1678]: 2026-01-16 21:17:35.417 [INFO][4013] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 16 21:17:35.466017 containerd[1678]: 2026-01-16 21:17:35.417 [INFO][4013] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.61.193/26] IPv6=[] ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" HandleID="k8s-pod-network.aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Workload="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" Jan 16 21:17:35.466155 containerd[1678]: 2026-01-16 21:17:35.421 [INFO][4001] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Namespace="calico-system" Pod="whisker-6b5c8f6bc4-rwplp" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0", GenerateName:"whisker-6b5c8f6bc4-", Namespace:"calico-system", SelfLink:"", UID:"96381c87-0aa3-4b10-9efd-84c74923efa3", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6b5c8f6bc4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"", Pod:"whisker-6b5c8f6bc4-rwplp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali3f679ffa1b7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:35.466155 containerd[1678]: 2026-01-16 21:17:35.421 [INFO][4001] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.193/32] ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Namespace="calico-system" Pod="whisker-6b5c8f6bc4-rwplp" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" Jan 16 21:17:35.466233 containerd[1678]: 2026-01-16 21:17:35.421 [INFO][4001] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3f679ffa1b7 ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Namespace="calico-system" Pod="whisker-6b5c8f6bc4-rwplp" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" Jan 16 21:17:35.466233 containerd[1678]: 2026-01-16 21:17:35.448 [INFO][4001] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Namespace="calico-system" Pod="whisker-6b5c8f6bc4-rwplp" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" Jan 16 21:17:35.466274 containerd[1678]: 2026-01-16 21:17:35.449 [INFO][4001] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Namespace="calico-system" Pod="whisker-6b5c8f6bc4-rwplp" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0", GenerateName:"whisker-6b5c8f6bc4-", Namespace:"calico-system", SelfLink:"", 
UID:"96381c87-0aa3-4b10-9efd-84c74923efa3", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6b5c8f6bc4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055", Pod:"whisker-6b5c8f6bc4-rwplp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3f679ffa1b7", MAC:"6e:f9:ea:20:74:a3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:35.466325 containerd[1678]: 2026-01-16 21:17:35.461 [INFO][4001] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" Namespace="calico-system" Pod="whisker-6b5c8f6bc4-rwplp" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-whisker--6b5c8f6bc4--rwplp-eth0" Jan 16 21:17:35.541842 containerd[1678]: time="2026-01-16T21:17:35.541699260Z" level=info msg="connecting to shim aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055" address="unix:///run/containerd/s/f63a96c1086687ce2ad8e69d0c2728669c5eb448090e81db97fdbfef181b04d6" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:35.585865 systemd[1]: Started 
cri-containerd-aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055.scope - libcontainer container aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055. Jan 16 21:17:35.601000 audit: BPF prog-id=175 op=LOAD Jan 16 21:17:35.602000 audit: BPF prog-id=176 op=LOAD Jan 16 21:17:35.602000 audit[4130]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4119 pid=4130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165616266623565323438343734636333306438386138613538393762 Jan 16 21:17:35.604000 audit: BPF prog-id=176 op=UNLOAD Jan 16 21:17:35.604000 audit[4130]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4119 pid=4130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.604000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165616266623565323438343734636333306438386138613538393762 Jan 16 21:17:35.604000 audit: BPF prog-id=177 op=LOAD Jan 16 21:17:35.604000 audit[4130]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4119 pid=4130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.604000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165616266623565323438343734636333306438386138613538393762 Jan 16 21:17:35.604000 audit: BPF prog-id=178 op=LOAD Jan 16 21:17:35.604000 audit[4130]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4119 pid=4130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.604000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165616266623565323438343734636333306438386138613538393762 Jan 16 21:17:35.604000 audit: BPF prog-id=178 op=UNLOAD Jan 16 21:17:35.604000 audit[4130]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4119 pid=4130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.604000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165616266623565323438343734636333306438386138613538393762 Jan 16 21:17:35.604000 audit: BPF prog-id=177 op=UNLOAD Jan 16 21:17:35.604000 audit[4130]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4119 pid=4130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 21:17:35.604000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165616266623565323438343734636333306438386138613538393762 Jan 16 21:17:35.604000 audit: BPF prog-id=179 op=LOAD Jan 16 21:17:35.604000 audit[4130]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4119 pid=4130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.604000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165616266623565323438343734636333306438386138613538393762 Jan 16 21:17:35.677235 kubelet[2882]: I0116 21:17:35.677199 2882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2007218a-042d-47f9-bd6c-94e42c633f48" path="/var/lib/kubelet/pods/2007218a-042d-47f9-bd6c-94e42c633f48/volumes" Jan 16 21:17:35.741996 containerd[1678]: time="2026-01-16T21:17:35.741825498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b5c8f6bc4-rwplp,Uid:96381c87-0aa3-4b10-9efd-84c74923efa3,Namespace:calico-system,Attempt:0,} returns sandbox id \"aeabfb5e248474cc30d88a8a5897b563ccd4b09fa4ffbf1a38566fab9dbf8055\"" Jan 16 21:17:35.745950 containerd[1678]: time="2026-01-16T21:17:35.745921368Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 21:17:35.793000 audit: BPF prog-id=180 op=LOAD Jan 16 21:17:35.793000 audit[4189]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff32bbe600 a2=98 a3=1fffffffffffffff items=0 ppid=4045 pid=4189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.793000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 21:17:35.793000 audit: BPF prog-id=180 op=UNLOAD Jan 16 21:17:35.793000 audit[4189]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff32bbe5d0 a3=0 items=0 ppid=4045 pid=4189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.793000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 21:17:35.793000 audit: BPF prog-id=181 op=LOAD Jan 16 21:17:35.793000 audit[4189]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff32bbe4e0 a2=94 a3=3 items=0 ppid=4045 pid=4189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.793000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 21:17:35.793000 audit: BPF prog-id=181 op=UNLOAD Jan 16 21:17:35.793000 audit[4189]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff32bbe4e0 a2=94 a3=3 items=0 
ppid=4045 pid=4189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.793000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 21:17:35.793000 audit: BPF prog-id=182 op=LOAD Jan 16 21:17:35.793000 audit[4189]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff32bbe520 a2=94 a3=7fff32bbe700 items=0 ppid=4045 pid=4189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.793000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 21:17:35.793000 audit: BPF prog-id=182 op=UNLOAD Jan 16 21:17:35.793000 audit[4189]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff32bbe520 a2=94 a3=7fff32bbe700 items=0 ppid=4045 pid=4189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.793000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 21:17:35.794000 audit: BPF prog-id=183 op=LOAD Jan 16 21:17:35.794000 audit[4190]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff958faa80 a2=98 a3=3 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.794000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.795000 audit: BPF prog-id=183 op=UNLOAD Jan 16 21:17:35.795000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff958faa50 a3=0 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.795000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.795000 audit: BPF prog-id=184 op=LOAD Jan 16 21:17:35.795000 audit[4190]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff958fa870 a2=94 a3=54428f items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.795000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.795000 audit: BPF prog-id=184 op=UNLOAD Jan 16 21:17:35.795000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff958fa870 a2=94 a3=54428f items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.795000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.795000 audit: BPF prog-id=185 op=LOAD Jan 16 21:17:35.795000 audit[4190]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff958fa8a0 
a2=94 a3=2 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.795000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.795000 audit: BPF prog-id=185 op=UNLOAD Jan 16 21:17:35.795000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff958fa8a0 a2=0 a3=2 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.795000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.982000 audit: BPF prog-id=186 op=LOAD Jan 16 21:17:35.982000 audit[4190]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff958fa760 a2=94 a3=1 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.982000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.982000 audit: BPF prog-id=186 op=UNLOAD Jan 16 21:17:35.982000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff958fa760 a2=94 a3=1 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.982000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=187 op=LOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff958fa750 a2=94 a3=4 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=187 op=UNLOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff958fa750 a2=0 a3=4 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=188 op=LOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff958fa5b0 a2=94 a3=5 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=188 op=UNLOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff958fa5b0 a2=0 a3=5 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=189 op=LOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff958fa7d0 a2=94 a3=6 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=189 op=UNLOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff958fa7d0 a2=0 a3=6 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=190 op=LOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff958f9f80 a2=94 a3=88 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=191 op=LOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7fff958f9e00 a2=94 a3=2 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.993000 audit: BPF prog-id=191 op=UNLOAD Jan 16 21:17:35.993000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7fff958f9e30 a2=0 a3=7fff958f9f30 items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 21:17:35.993000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:35.994000 audit: BPF prog-id=190 op=UNLOAD Jan 16 21:17:35.994000 audit[4190]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=158e4d10 a2=0 a3=d73ee76c0b723b1d items=0 ppid=4045 pid=4190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:35.994000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 21:17:36.004000 audit: BPF prog-id=192 op=LOAD Jan 16 21:17:36.004000 audit[4219]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe31953060 a2=98 a3=1999999999999999 items=0 ppid=4045 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.004000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 21:17:36.004000 audit: BPF prog-id=192 op=UNLOAD Jan 16 21:17:36.004000 audit[4219]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe31953030 a3=0 items=0 ppid=4045 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.004000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 
16 21:17:36.004000 audit: BPF prog-id=193 op=LOAD Jan 16 21:17:36.004000 audit[4219]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe31952f40 a2=94 a3=ffff items=0 ppid=4045 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.004000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 21:17:36.004000 audit: BPF prog-id=193 op=UNLOAD Jan 16 21:17:36.004000 audit[4219]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe31952f40 a2=94 a3=ffff items=0 ppid=4045 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.004000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 21:17:36.004000 audit: BPF prog-id=194 op=LOAD Jan 16 21:17:36.004000 audit[4219]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe31952f80 a2=94 a3=7ffe31953160 items=0 ppid=4045 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.004000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 21:17:36.004000 audit: BPF prog-id=194 op=UNLOAD Jan 16 21:17:36.004000 audit[4219]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe31952f80 a2=94 a3=7ffe31953160 items=0 ppid=4045 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.004000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 21:17:36.067950 systemd-networkd[1571]: vxlan.calico: Link UP Jan 16 21:17:36.067959 systemd-networkd[1571]: vxlan.calico: Gained carrier Jan 16 21:17:36.087630 containerd[1678]: time="2026-01-16T21:17:36.087586219Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:36.088000 audit: BPF prog-id=195 op=LOAD Jan 16 21:17:36.088000 audit[4244]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc9a2ee610 a2=98 a3=0 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.088000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.088000 audit: BPF prog-id=195 op=UNLOAD Jan 16 21:17:36.088000 audit[4244]: 
SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffc9a2ee5e0 a3=0 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.088000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.088000 audit: BPF prog-id=196 op=LOAD Jan 16 21:17:36.088000 audit[4244]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc9a2ee420 a2=94 a3=54428f items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.088000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.088000 audit: BPF prog-id=196 op=UNLOAD Jan 16 21:17:36.088000 audit[4244]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc9a2ee420 a2=94 a3=54428f items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.088000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.088000 audit: BPF prog-id=197 op=LOAD Jan 16 21:17:36.088000 audit[4244]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 
a1=7ffc9a2ee450 a2=94 a3=2 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.088000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.088000 audit: BPF prog-id=197 op=UNLOAD Jan 16 21:17:36.088000 audit[4244]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc9a2ee450 a2=0 a3=2 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.088000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.088000 audit: BPF prog-id=198 op=LOAD Jan 16 21:17:36.088000 audit[4244]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc9a2ee200 a2=94 a3=4 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.088000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.089000 audit: BPF prog-id=198 op=UNLOAD Jan 16 21:17:36.089000 audit[4244]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffc9a2ee200 a2=94 a3=4 items=0 ppid=4045 pid=4244 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.089000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.089000 audit: BPF prog-id=199 op=LOAD Jan 16 21:17:36.089000 audit[4244]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc9a2ee300 a2=94 a3=7ffc9a2ee480 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.089000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.089000 audit: BPF prog-id=199 op=UNLOAD Jan 16 21:17:36.089000 audit[4244]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffc9a2ee300 a2=0 a3=7ffc9a2ee480 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.089000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.089000 audit: BPF prog-id=200 op=LOAD Jan 16 21:17:36.089000 audit[4244]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc9a2eda30 a2=94 a3=2 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.089000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.089000 audit: BPF prog-id=200 op=UNLOAD Jan 16 21:17:36.089000 audit[4244]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffc9a2eda30 a2=0 a3=2 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.089000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.089000 audit: BPF prog-id=201 op=LOAD Jan 16 21:17:36.089000 audit[4244]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc9a2edb30 a2=94 a3=30 items=0 ppid=4045 pid=4244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.089000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 21:17:36.093345 containerd[1678]: time="2026-01-16T21:17:36.091098822Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 21:17:36.093345 containerd[1678]: time="2026-01-16T21:17:36.091184918Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:36.093409 kubelet[2882]: E0116 21:17:36.091389 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:17:36.093409 kubelet[2882]: E0116 21:17:36.091449 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:17:36.100351 kubelet[2882]: E0116 21:17:36.099661 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aba7cb893c2466794c6629c238ba8cf,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:36.101742 containerd[1678]: time="2026-01-16T21:17:36.101686434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 21:17:36.103000 audit: BPF prog-id=202 op=LOAD Jan 16 
21:17:36.103000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcf1a11290 a2=98 a3=0 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.103000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.104000 audit: BPF prog-id=202 op=UNLOAD Jan 16 21:17:36.104000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffcf1a11260 a3=0 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.104000 audit: BPF prog-id=203 op=LOAD Jan 16 21:17:36.104000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcf1a11080 a2=94 a3=54428f items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.104000 audit: BPF prog-id=203 op=UNLOAD Jan 16 21:17:36.104000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcf1a11080 a2=94 a3=54428f items=0 ppid=4045 pid=4250 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.104000 audit: BPF prog-id=204 op=LOAD Jan 16 21:17:36.104000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcf1a110b0 a2=94 a3=2 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.104000 audit: BPF prog-id=204 op=UNLOAD Jan 16 21:17:36.104000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcf1a110b0 a2=0 a3=2 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.265000 audit: BPF prog-id=205 op=LOAD Jan 16 21:17:36.267531 kernel: kauditd_printk_skb: 180 callbacks suppressed Jan 16 21:17:36.267579 kernel: audit: type=1334 audit(1768598256.265:638): prog-id=205 op=LOAD Jan 16 21:17:36.265000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcf1a10f70 a2=94 a3=1 items=0 ppid=4045 
pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.274620 kernel: audit: type=1300 audit(1768598256.265:638): arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcf1a10f70 a2=94 a3=1 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.265000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.277617 kernel: audit: type=1327 audit(1768598256.265:638): proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.265000 audit: BPF prog-id=205 op=UNLOAD Jan 16 21:17:36.280618 kernel: audit: type=1334 audit(1768598256.265:639): prog-id=205 op=UNLOAD Jan 16 21:17:36.265000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcf1a10f70 a2=94 a3=1 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.265000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.285750 kernel: audit: type=1300 audit(1768598256.265:639): arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcf1a10f70 a2=94 a3=1 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.285792 kernel: audit: type=1327 audit(1768598256.265:639): proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.278000 audit: BPF prog-id=206 op=LOAD Jan 16 21:17:36.287999 kernel: audit: type=1334 audit(1768598256.278:640): prog-id=206 op=LOAD Jan 16 21:17:36.289025 kernel: audit: type=1300 audit(1768598256.278:640): arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcf1a10f60 a2=94 a3=4 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.278000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcf1a10f60 a2=94 a3=4 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.278000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.293426 kernel: audit: type=1327 audit(1768598256.278:640): proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.278000 audit: BPF prog-id=206 op=UNLOAD Jan 16 21:17:36.295707 kernel: audit: type=1334 audit(1768598256.278:641): prog-id=206 op=UNLOAD Jan 16 21:17:36.278000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcf1a10f60 a2=0 a3=4 items=0 
ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.278000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.278000 audit: BPF prog-id=207 op=LOAD Jan 16 21:17:36.278000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcf1a10dc0 a2=94 a3=5 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.278000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.278000 audit: BPF prog-id=207 op=UNLOAD Jan 16 21:17:36.278000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffcf1a10dc0 a2=0 a3=5 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.278000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.278000 audit: BPF prog-id=208 op=LOAD Jan 16 21:17:36.278000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcf1a10fe0 a2=94 a3=6 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.278000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.278000 audit: BPF prog-id=208 op=UNLOAD Jan 16 21:17:36.278000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcf1a10fe0 a2=0 a3=6 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.278000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.278000 audit: BPF prog-id=209 op=LOAD Jan 16 21:17:36.278000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcf1a10790 a2=94 a3=88 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.278000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.279000 audit: BPF prog-id=210 op=LOAD Jan 16 21:17:36.279000 audit[4250]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffcf1a10610 a2=94 a3=2 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.279000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.279000 audit: BPF prog-id=210 op=UNLOAD Jan 16 21:17:36.279000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffcf1a10640 a2=0 a3=7ffcf1a10740 items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.279000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.279000 audit: BPF prog-id=209 op=UNLOAD Jan 16 21:17:36.279000 audit[4250]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=18df6d10 a2=0 a3=6b68ec313983bd items=0 ppid=4045 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.279000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 21:17:36.293000 audit: BPF prog-id=201 op=UNLOAD Jan 16 21:17:36.293000 audit[4045]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c00007d0c0 a2=0 a3=0 items=0 ppid=4017 pid=4045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.293000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 16 21:17:36.339000 audit[4274]: NETFILTER_CFG table=nat:119 family=2 
entries=15 op=nft_register_chain pid=4274 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:36.339000 audit[4274]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7fff1833d5f0 a2=0 a3=7fff1833d5dc items=0 ppid=4045 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.339000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:36.343000 audit[4275]: NETFILTER_CFG table=mangle:120 family=2 entries=16 op=nft_register_chain pid=4275 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:36.343000 audit[4275]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffcffb91370 a2=0 a3=7ffcffb9135c items=0 ppid=4045 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.343000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:36.351000 audit[4273]: NETFILTER_CFG table=raw:121 family=2 entries=21 op=nft_register_chain pid=4273 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:36.351000 audit[4273]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffeb9bef840 a2=0 a3=7ffeb9bef82c items=0 ppid=4045 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.351000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:36.353000 audit[4277]: NETFILTER_CFG table=filter:122 family=2 entries=94 op=nft_register_chain pid=4277 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:36.353000 audit[4277]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7fff8a4e6a50 a2=0 a3=7fff8a4e6a3c items=0 ppid=4045 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.353000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:36.427977 containerd[1678]: time="2026-01-16T21:17:36.427923567Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:36.429724 containerd[1678]: time="2026-01-16T21:17:36.429686349Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 21:17:36.429800 containerd[1678]: time="2026-01-16T21:17:36.429780955Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:36.430000 kubelet[2882]: E0116 21:17:36.429966 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 
21:17:36.430397 kubelet[2882]: E0116 21:17:36.430066 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:17:36.430458 kubelet[2882]: E0116 21:17:36.430174 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,
SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:36.431838 kubelet[2882]: E0116 21:17:36.431777 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:17:36.802533 kubelet[2882]: E0116 21:17:36.802437 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:17:36.841000 audit[4288]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=4288 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:36.841000 audit[4288]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd723073d0 a2=0 a3=7ffd723073bc items=0 ppid=3028 pid=4288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.841000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:36.849000 audit[4288]: NETFILTER_CFG table=nat:124 family=2 entries=14 op=nft_register_rule pid=4288 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:36.849000 audit[4288]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffd723073d0 a2=0 a3=0 items=0 ppid=3028 pid=4288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:36.849000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:37.212492 systemd-networkd[1571]: cali3f679ffa1b7: Gained IPv6LL Jan 16 21:17:37.979737 systemd-networkd[1571]: vxlan.calico: Gained IPv6LL Jan 16 21:17:38.670095 containerd[1678]: time="2026-01-16T21:17:38.670039682Z" level=info msg="RunPodSandbox 
for &PodSandboxMetadata{Name:goldmane-666569f655-ns7rw,Uid:3048fb4b-f634-4014-ae59-9ad3946acc61,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:38.779106 systemd-networkd[1571]: cali47b022fe02f: Link UP Jan 16 21:17:38.780011 systemd-networkd[1571]: cali47b022fe02f: Gained carrier Jan 16 21:17:38.792233 containerd[1678]: 2026-01-16 21:17:38.715 [INFO][4292] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0 goldmane-666569f655- calico-system 3048fb4b-f634-4014-ae59-9ad3946acc61 792 0 2026-01-16 21:17:13 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4580-0-0-p-be73a47b79 goldmane-666569f655-ns7rw eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali47b022fe02f [] [] }} ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Namespace="calico-system" Pod="goldmane-666569f655-ns7rw" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-" Jan 16 21:17:38.792233 containerd[1678]: 2026-01-16 21:17:38.715 [INFO][4292] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Namespace="calico-system" Pod="goldmane-666569f655-ns7rw" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" Jan 16 21:17:38.792233 containerd[1678]: 2026-01-16 21:17:38.746 [INFO][4303] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" HandleID="k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Workload="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" Jan 16 
21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.746 [INFO][4303] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" HandleID="k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Workload="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f190), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-be73a47b79", "pod":"goldmane-666569f655-ns7rw", "timestamp":"2026-01-16 21:17:38.74631458 +0000 UTC"}, Hostname:"ci-4580-0-0-p-be73a47b79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.746 [INFO][4303] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.746 [INFO][4303] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.746 [INFO][4303] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-be73a47b79' Jan 16 21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.753 [INFO][4303] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.757 [INFO][4303] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.760 [INFO][4303] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.762 [INFO][4303] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792406 containerd[1678]: 2026-01-16 21:17:38.763 [INFO][4303] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792587 containerd[1678]: 2026-01-16 21:17:38.763 [INFO][4303] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792587 containerd[1678]: 2026-01-16 21:17:38.764 [INFO][4303] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c Jan 16 21:17:38.792587 containerd[1678]: 2026-01-16 21:17:38.769 [INFO][4303] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792587 containerd[1678]: 2026-01-16 21:17:38.774 [INFO][4303] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.61.194/26] block=192.168.61.192/26 handle="k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792587 containerd[1678]: 2026-01-16 21:17:38.774 [INFO][4303] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.194/26] handle="k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:38.792587 containerd[1678]: 2026-01-16 21:17:38.775 [INFO][4303] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 21:17:38.792587 containerd[1678]: 2026-01-16 21:17:38.775 [INFO][4303] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.61.194/26] IPv6=[] ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" HandleID="k8s-pod-network.6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Workload="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" Jan 16 21:17:38.793267 containerd[1678]: 2026-01-16 21:17:38.776 [INFO][4292] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Namespace="calico-system" Pod="goldmane-666569f655-ns7rw" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"3048fb4b-f634-4014-ae59-9ad3946acc61", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"", Pod:"goldmane-666569f655-ns7rw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali47b022fe02f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:38.793329 containerd[1678]: 2026-01-16 21:17:38.776 [INFO][4292] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.194/32] ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Namespace="calico-system" Pod="goldmane-666569f655-ns7rw" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" Jan 16 21:17:38.793329 containerd[1678]: 2026-01-16 21:17:38.776 [INFO][4292] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali47b022fe02f ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Namespace="calico-system" Pod="goldmane-666569f655-ns7rw" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" Jan 16 21:17:38.793329 containerd[1678]: 2026-01-16 21:17:38.779 [INFO][4292] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Namespace="calico-system" Pod="goldmane-666569f655-ns7rw" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" Jan 16 21:17:38.793389 containerd[1678]: 2026-01-16 21:17:38.780 [INFO][4292] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Namespace="calico-system" Pod="goldmane-666569f655-ns7rw" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"3048fb4b-f634-4014-ae59-9ad3946acc61", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c", Pod:"goldmane-666569f655-ns7rw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali47b022fe02f", MAC:"2a:80:27:f1:17:ef", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:38.793440 containerd[1678]: 2026-01-16 21:17:38.789 [INFO][4292] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" Namespace="calico-system" Pod="goldmane-666569f655-ns7rw" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-goldmane--666569f655--ns7rw-eth0" Jan 16 21:17:38.807000 audit[4317]: NETFILTER_CFG table=filter:125 family=2 entries=44 op=nft_register_chain pid=4317 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:38.807000 audit[4317]: SYSCALL arch=c000003e syscall=46 success=yes exit=25180 a0=3 a1=7ffdaac3b450 a2=0 a3=7ffdaac3b43c items=0 ppid=4045 pid=4317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:38.807000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:38.818612 containerd[1678]: time="2026-01-16T21:17:38.817971935Z" level=info msg="connecting to shim 6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c" address="unix:///run/containerd/s/ba7d99e667733ca34ade32ca8501c52f8322ce0e57373a2eb082195c0ae961be" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:38.844830 systemd[1]: Started cri-containerd-6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c.scope - libcontainer container 6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c. 
Jan 16 21:17:38.854000 audit: BPF prog-id=211 op=LOAD Jan 16 21:17:38.854000 audit: BPF prog-id=212 op=LOAD Jan 16 21:17:38.854000 audit[4339]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4328 pid=4339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:38.854000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661336365616630663534666237353662386164656438306666633366 Jan 16 21:17:38.855000 audit: BPF prog-id=212 op=UNLOAD Jan 16 21:17:38.855000 audit[4339]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4328 pid=4339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:38.855000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661336365616630663534666237353662386164656438306666633366 Jan 16 21:17:38.855000 audit: BPF prog-id=213 op=LOAD Jan 16 21:17:38.855000 audit[4339]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4328 pid=4339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:38.855000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661336365616630663534666237353662386164656438306666633366 Jan 16 21:17:38.855000 audit: BPF prog-id=214 op=LOAD Jan 16 21:17:38.855000 audit[4339]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4328 pid=4339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:38.855000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661336365616630663534666237353662386164656438306666633366 Jan 16 21:17:38.855000 audit: BPF prog-id=214 op=UNLOAD Jan 16 21:17:38.855000 audit[4339]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4328 pid=4339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:38.855000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661336365616630663534666237353662386164656438306666633366 Jan 16 21:17:38.855000 audit: BPF prog-id=213 op=UNLOAD Jan 16 21:17:38.855000 audit[4339]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4328 pid=4339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:38.855000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661336365616630663534666237353662386164656438306666633366 Jan 16 21:17:38.855000 audit: BPF prog-id=215 op=LOAD Jan 16 21:17:38.855000 audit[4339]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4328 pid=4339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:38.855000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661336365616630663534666237353662386164656438306666633366 Jan 16 21:17:38.889074 containerd[1678]: time="2026-01-16T21:17:38.889021549Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-ns7rw,Uid:3048fb4b-f634-4014-ae59-9ad3946acc61,Namespace:calico-system,Attempt:0,} returns sandbox id \"6a3ceaf0f54fb756b8aded80ffc3fbb47f07049187b168c6a173b3581646415c\"" Jan 16 21:17:38.890714 containerd[1678]: time="2026-01-16T21:17:38.890655277Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 21:17:39.226616 containerd[1678]: time="2026-01-16T21:17:39.226108283Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:39.227882 containerd[1678]: time="2026-01-16T21:17:39.227772159Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 21:17:39.228714 
containerd[1678]: time="2026-01-16T21:17:39.227847129Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:39.229150 kubelet[2882]: E0116 21:17:39.228913 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:17:39.229150 kubelet[2882]: E0116 21:17:39.228956 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:17:39.229150 kubelet[2882]: E0116 21:17:39.229098 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-b
undle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twtjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-ns7rw_calico-system(3048fb4b-f634-4014-ae59-9ad3946acc61): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:39.230682 kubelet[2882]: E0116 21:17:39.230653 2882 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:17:39.669896 containerd[1678]: time="2026-01-16T21:17:39.669790659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56f896845d-kkjxr,Uid:91564960-b99d-4a80-9373-1c0eb8e68a7f,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:39.782364 systemd-networkd[1571]: calibfc471c9981: Link UP Jan 16 21:17:39.783024 systemd-networkd[1571]: calibfc471c9981: Gained carrier Jan 16 21:17:39.795521 containerd[1678]: 2026-01-16 21:17:39.721 [INFO][4365] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0 calico-kube-controllers-56f896845d- calico-system 91564960-b99d-4a80-9373-1c0eb8e68a7f 791 0 2026-01-16 21:17:16 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:56f896845d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4580-0-0-p-be73a47b79 calico-kube-controllers-56f896845d-kkjxr eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calibfc471c9981 [] [] }} ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Namespace="calico-system" Pod="calico-kube-controllers-56f896845d-kkjxr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-" Jan 16 21:17:39.795521 containerd[1678]: 2026-01-16 21:17:39.722 [INFO][4365] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Namespace="calico-system" Pod="calico-kube-controllers-56f896845d-kkjxr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" Jan 16 21:17:39.795521 containerd[1678]: 2026-01-16 21:17:39.748 [INFO][4376] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" HandleID="k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.748 [INFO][4376] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" HandleID="k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-be73a47b79", "pod":"calico-kube-controllers-56f896845d-kkjxr", "timestamp":"2026-01-16 21:17:39.748010947 +0000 UTC"}, Hostname:"ci-4580-0-0-p-be73a47b79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.748 [INFO][4376] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.748 [INFO][4376] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.748 [INFO][4376] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-be73a47b79' Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.755 [INFO][4376] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.758 [INFO][4376] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.762 [INFO][4376] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.763 [INFO][4376] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.795925 containerd[1678]: 2026-01-16 21:17:39.765 [INFO][4376] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.796132 containerd[1678]: 2026-01-16 21:17:39.766 [INFO][4376] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.796132 containerd[1678]: 2026-01-16 21:17:39.768 [INFO][4376] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491 Jan 16 21:17:39.796132 containerd[1678]: 2026-01-16 21:17:39.773 [INFO][4376] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.796132 containerd[1678]: 2026-01-16 21:17:39.777 [INFO][4376] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.61.195/26] block=192.168.61.192/26 handle="k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.796132 containerd[1678]: 2026-01-16 21:17:39.777 [INFO][4376] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.195/26] handle="k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:39.796132 containerd[1678]: 2026-01-16 21:17:39.777 [INFO][4376] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 21:17:39.796132 containerd[1678]: 2026-01-16 21:17:39.777 [INFO][4376] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.61.195/26] IPv6=[] ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" HandleID="k8s-pod-network.22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" Jan 16 21:17:39.796271 containerd[1678]: 2026-01-16 21:17:39.779 [INFO][4365] cni-plugin/k8s.go 418: Populated endpoint ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Namespace="calico-system" Pod="calico-kube-controllers-56f896845d-kkjxr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0", GenerateName:"calico-kube-controllers-56f896845d-", Namespace:"calico-system", SelfLink:"", UID:"91564960-b99d-4a80-9373-1c0eb8e68a7f", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"56f896845d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"", Pod:"calico-kube-controllers-56f896845d-kkjxr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibfc471c9981", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:39.796322 containerd[1678]: 2026-01-16 21:17:39.779 [INFO][4365] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.195/32] ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Namespace="calico-system" Pod="calico-kube-controllers-56f896845d-kkjxr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" Jan 16 21:17:39.796322 containerd[1678]: 2026-01-16 21:17:39.779 [INFO][4365] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibfc471c9981 ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Namespace="calico-system" Pod="calico-kube-controllers-56f896845d-kkjxr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" Jan 16 21:17:39.796322 containerd[1678]: 2026-01-16 21:17:39.783 [INFO][4365] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Namespace="calico-system" Pod="calico-kube-controllers-56f896845d-kkjxr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" Jan 16 21:17:39.796385 containerd[1678]: 2026-01-16 21:17:39.783 [INFO][4365] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Namespace="calico-system" Pod="calico-kube-controllers-56f896845d-kkjxr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0", GenerateName:"calico-kube-controllers-56f896845d-", Namespace:"calico-system", SelfLink:"", UID:"91564960-b99d-4a80-9373-1c0eb8e68a7f", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"56f896845d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491", Pod:"calico-kube-controllers-56f896845d-kkjxr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.195/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibfc471c9981", MAC:"6e:37:a5:e1:5e:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:39.796433 containerd[1678]: 2026-01-16 21:17:39.791 [INFO][4365] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" Namespace="calico-system" Pod="calico-kube-controllers-56f896845d-kkjxr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--kube--controllers--56f896845d--kkjxr-eth0" Jan 16 21:17:39.804712 kubelet[2882]: E0116 21:17:39.804685 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:17:39.812000 audit[4390]: NETFILTER_CFG table=filter:126 family=2 entries=40 op=nft_register_chain pid=4390 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:39.812000 audit[4390]: SYSCALL arch=c000003e syscall=46 success=yes exit=20764 a0=3 a1=7ffed2d53f40 a2=0 a3=7ffed2d53f2c items=0 ppid=4045 pid=4390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.812000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:39.828212 containerd[1678]: time="2026-01-16T21:17:39.828113074Z" level=info msg="connecting to shim 22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491" address="unix:///run/containerd/s/0aa78019e894e0e6bb7135e5ecc5e05d2468e39831c266d167e31e37c98613ad" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:39.841000 audit[4411]: NETFILTER_CFG table=filter:127 family=2 entries=20 op=nft_register_rule pid=4411 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:39.841000 audit[4411]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffe1b7ebee0 a2=0 a3=7ffe1b7ebecc items=0 ppid=3028 pid=4411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.841000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:39.844000 audit[4411]: NETFILTER_CFG table=nat:128 family=2 entries=14 op=nft_register_rule pid=4411 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:39.844000 audit[4411]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffe1b7ebee0 a2=0 a3=0 items=0 ppid=3028 pid=4411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.844000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:39.856798 systemd[1]: Started cri-containerd-22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491.scope - libcontainer container 
22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491. Jan 16 21:17:39.866000 audit: BPF prog-id=216 op=LOAD Jan 16 21:17:39.867000 audit: BPF prog-id=217 op=LOAD Jan 16 21:17:39.867000 audit[4413]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4400 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.867000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232663030373563316238333339623337333233623430633232653664 Jan 16 21:17:39.867000 audit: BPF prog-id=217 op=UNLOAD Jan 16 21:17:39.867000 audit[4413]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4400 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.867000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232663030373563316238333339623337333233623430633232653664 Jan 16 21:17:39.867000 audit: BPF prog-id=218 op=LOAD Jan 16 21:17:39.867000 audit[4413]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4400 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.867000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232663030373563316238333339623337333233623430633232653664 Jan 16 21:17:39.867000 audit: BPF prog-id=219 op=LOAD Jan 16 21:17:39.867000 audit[4413]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4400 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.867000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232663030373563316238333339623337333233623430633232653664 Jan 16 21:17:39.867000 audit: BPF prog-id=219 op=UNLOAD Jan 16 21:17:39.867000 audit[4413]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4400 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.867000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232663030373563316238333339623337333233623430633232653664 Jan 16 21:17:39.867000 audit: BPF prog-id=218 op=UNLOAD Jan 16 21:17:39.867000 audit[4413]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4400 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:39.867000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232663030373563316238333339623337333233623430633232653664 Jan 16 21:17:39.867000 audit: BPF prog-id=220 op=LOAD Jan 16 21:17:39.867000 audit[4413]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4400 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:39.867000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232663030373563316238333339623337333233623430633232653664 Jan 16 21:17:39.904312 containerd[1678]: time="2026-01-16T21:17:39.904274115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56f896845d-kkjxr,Uid:91564960-b99d-4a80-9373-1c0eb8e68a7f,Namespace:calico-system,Attempt:0,} returns sandbox id \"22f0075c1b8339b37323b40c22e6d37a532498374d28ab33471654bcf8d89491\"" Jan 16 21:17:39.905828 containerd[1678]: time="2026-01-16T21:17:39.905804038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 21:17:40.093735 systemd-networkd[1571]: cali47b022fe02f: Gained IPv6LL Jan 16 21:17:40.230880 containerd[1678]: time="2026-01-16T21:17:40.230681236Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:40.232644 containerd[1678]: time="2026-01-16T21:17:40.232466188Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 21:17:40.232813 containerd[1678]: time="2026-01-16T21:17:40.232669128Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:40.233145 kubelet[2882]: E0116 21:17:40.233081 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:17:40.233145 kubelet[2882]: E0116 21:17:40.233132 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:17:40.235640 kubelet[2882]: E0116 21:17:40.233268 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btq9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56f896845d-kkjxr_calico-system(91564960-b99d-4a80-9373-1c0eb8e68a7f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:40.235640 kubelet[2882]: E0116 21:17:40.234589 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:17:40.670276 containerd[1678]: time="2026-01-16T21:17:40.670235707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kql5z,Uid:a7efe392-df52-4d70-9d66-17372a93751c,Namespace:calico-system,Attempt:0,}" Jan 16 21:17:40.773553 systemd-networkd[1571]: calif10188df8cb: Link 
UP Jan 16 21:17:40.774201 systemd-networkd[1571]: calif10188df8cb: Gained carrier Jan 16 21:17:40.789034 containerd[1678]: 2026-01-16 21:17:40.713 [INFO][4439] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0 csi-node-driver- calico-system a7efe392-df52-4d70-9d66-17372a93751c 685 0 2026-01-16 21:17:16 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4580-0-0-p-be73a47b79 csi-node-driver-kql5z eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif10188df8cb [] [] }} ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Namespace="calico-system" Pod="csi-node-driver-kql5z" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-" Jan 16 21:17:40.789034 containerd[1678]: 2026-01-16 21:17:40.713 [INFO][4439] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Namespace="calico-system" Pod="csi-node-driver-kql5z" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" Jan 16 21:17:40.789034 containerd[1678]: 2026-01-16 21:17:40.740 [INFO][4452] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" HandleID="k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Workload="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.740 [INFO][4452] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" HandleID="k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Workload="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cb8d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-be73a47b79", "pod":"csi-node-driver-kql5z", "timestamp":"2026-01-16 21:17:40.740105967 +0000 UTC"}, Hostname:"ci-4580-0-0-p-be73a47b79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.740 [INFO][4452] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.740 [INFO][4452] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.740 [INFO][4452] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-be73a47b79' Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.745 [INFO][4452] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.749 [INFO][4452] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.752 [INFO][4452] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.753 [INFO][4452] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.789346 containerd[1678]: 2026-01-16 21:17:40.755 [INFO][4452] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.790049 containerd[1678]: 2026-01-16 21:17:40.755 [INFO][4452] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.790049 containerd[1678]: 2026-01-16 21:17:40.756 [INFO][4452] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2 Jan 16 21:17:40.790049 containerd[1678]: 2026-01-16 21:17:40.762 [INFO][4452] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.790049 containerd[1678]: 2026-01-16 21:17:40.767 [INFO][4452] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.61.196/26] block=192.168.61.192/26 handle="k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.790049 containerd[1678]: 2026-01-16 21:17:40.767 [INFO][4452] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.196/26] handle="k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:40.790049 containerd[1678]: 2026-01-16 21:17:40.767 [INFO][4452] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 21:17:40.790049 containerd[1678]: 2026-01-16 21:17:40.767 [INFO][4452] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.61.196/26] IPv6=[] ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" HandleID="k8s-pod-network.8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Workload="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" Jan 16 21:17:40.790209 containerd[1678]: 2026-01-16 21:17:40.770 [INFO][4439] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Namespace="calico-system" Pod="csi-node-driver-kql5z" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a7efe392-df52-4d70-9d66-17372a93751c", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"", Pod:"csi-node-driver-kql5z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif10188df8cb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:40.790725 containerd[1678]: 2026-01-16 21:17:40.770 [INFO][4439] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.196/32] ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Namespace="calico-system" Pod="csi-node-driver-kql5z" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" Jan 16 21:17:40.790725 containerd[1678]: 2026-01-16 21:17:40.770 [INFO][4439] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif10188df8cb ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Namespace="calico-system" Pod="csi-node-driver-kql5z" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" Jan 16 21:17:40.790725 containerd[1678]: 2026-01-16 21:17:40.774 [INFO][4439] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Namespace="calico-system" Pod="csi-node-driver-kql5z" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" Jan 16 21:17:40.790797 
containerd[1678]: 2026-01-16 21:17:40.775 [INFO][4439] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Namespace="calico-system" Pod="csi-node-driver-kql5z" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a7efe392-df52-4d70-9d66-17372a93751c", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2", Pod:"csi-node-driver-kql5z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif10188df8cb", MAC:"86:05:f0:47:4d:48", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:40.790851 containerd[1678]: 
2026-01-16 21:17:40.786 [INFO][4439] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" Namespace="calico-system" Pod="csi-node-driver-kql5z" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-csi--node--driver--kql5z-eth0" Jan 16 21:17:40.802000 audit[4466]: NETFILTER_CFG table=filter:129 family=2 entries=44 op=nft_register_chain pid=4466 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:40.802000 audit[4466]: SYSCALL arch=c000003e syscall=46 success=yes exit=21952 a0=3 a1=7ffcd3a4d3a0 a2=0 a3=7ffcd3a4d38c items=0 ppid=4045 pid=4466 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:40.802000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:40.808773 kubelet[2882]: E0116 21:17:40.808732 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:17:40.809738 kubelet[2882]: E0116 21:17:40.809377 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:17:40.835576 containerd[1678]: time="2026-01-16T21:17:40.835068863Z" level=info msg="connecting to shim 8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2" address="unix:///run/containerd/s/ff8841c2957b7c96c3c150c3503be9d6214465eba325d85b30bb44ebae01a190" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:40.860311 systemd-networkd[1571]: calibfc471c9981: Gained IPv6LL Jan 16 21:17:40.872655 systemd[1]: Started cri-containerd-8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2.scope - libcontainer container 8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2. Jan 16 21:17:40.896000 audit: BPF prog-id=221 op=LOAD Jan 16 21:17:40.896000 audit: BPF prog-id=222 op=LOAD Jan 16 21:17:40.896000 audit[4485]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4475 pid=4485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:40.896000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862326464326639326163613165663137656439656264623465653934 Jan 16 21:17:40.896000 audit: BPF prog-id=222 op=UNLOAD Jan 16 21:17:40.896000 audit[4485]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4475 pid=4485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:40.896000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862326464326639326163613165663137656439656264623465653934 Jan 16 21:17:40.897000 audit: BPF prog-id=223 op=LOAD Jan 16 21:17:40.897000 audit[4485]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4475 pid=4485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:40.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862326464326639326163613165663137656439656264623465653934 Jan 16 21:17:40.897000 audit: BPF prog-id=224 op=LOAD Jan 16 21:17:40.897000 audit[4485]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4475 pid=4485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:40.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862326464326639326163613165663137656439656264623465653934 Jan 16 21:17:40.897000 audit: BPF prog-id=224 op=UNLOAD Jan 16 21:17:40.897000 audit[4485]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4475 pid=4485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:40.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862326464326639326163613165663137656439656264623465653934 Jan 16 21:17:40.897000 audit: BPF prog-id=223 op=UNLOAD Jan 16 21:17:40.897000 audit[4485]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4475 pid=4485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:40.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862326464326639326163613165663137656439656264623465653934 Jan 16 21:17:40.897000 audit: BPF prog-id=225 op=LOAD Jan 16 21:17:40.897000 audit[4485]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4475 pid=4485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:40.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862326464326639326163613165663137656439656264623465653934 Jan 16 21:17:40.912505 containerd[1678]: time="2026-01-16T21:17:40.912467405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kql5z,Uid:a7efe392-df52-4d70-9d66-17372a93751c,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"8b2dd2f92aca1ef17ed9ebdb4ee941e88b726c750478b1a306c6804fcb3f33e2\"" Jan 16 21:17:40.914444 containerd[1678]: time="2026-01-16T21:17:40.914416517Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 21:17:41.245480 containerd[1678]: time="2026-01-16T21:17:41.245399676Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:41.247374 containerd[1678]: time="2026-01-16T21:17:41.247338973Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 21:17:41.247468 containerd[1678]: time="2026-01-16T21:17:41.247453250Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:41.247696 kubelet[2882]: E0116 21:17:41.247654 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:17:41.248161 kubelet[2882]: E0116 21:17:41.247715 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:17:41.248161 kubelet[2882]: E0116 21:17:41.247862 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 16 21:17:41.250476 containerd[1678]: time="2026-01-16T21:17:41.250423849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 21:17:41.586580 containerd[1678]: time="2026-01-16T21:17:41.586409659Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:41.588305 containerd[1678]: time="2026-01-16T21:17:41.588264679Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 21:17:41.588412 containerd[1678]: time="2026-01-16T21:17:41.588371208Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:41.588666 kubelet[2882]: E0116 21:17:41.588560 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:17:41.588801 kubelet[2882]: E0116 21:17:41.588654 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:17:41.588962 kubelet[2882]: E0116 21:17:41.588933 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:41.590282 kubelet[2882]: E0116 21:17:41.590237 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:41.670022 containerd[1678]: time="2026-01-16T21:17:41.669917184Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787d9fcf6-9lshv,Uid:ce4d12ad-802f-4a94-a69f-0be1648ae09b,Namespace:calico-apiserver,Attempt:0,}" Jan 16 21:17:41.670506 containerd[1678]: time="2026-01-16T21:17:41.670280310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787d9fcf6-mcbl2,Uid:eb6b7565-0bac-4e11-9918-28c46c9c8c58,Namespace:calico-apiserver,Attempt:0,}" Jan 16 21:17:41.671003 containerd[1678]: time="2026-01-16T21:17:41.670971362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mg6kr,Uid:1133c715-3a00-415e-9905-d5dd19553c44,Namespace:kube-system,Attempt:0,}" Jan 16 21:17:41.810997 kubelet[2882]: E0116 21:17:41.810928 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:17:41.812005 kubelet[2882]: E0116 21:17:41.811980 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:41.844632 systemd-networkd[1571]: cali5da0d7c6c6d: Link UP Jan 16 21:17:41.847023 systemd-networkd[1571]: cali5da0d7c6c6d: Gained carrier Jan 16 21:17:41.859627 containerd[1678]: 2026-01-16 21:17:41.745 [INFO][4519] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0 calico-apiserver-6787d9fcf6- calico-apiserver ce4d12ad-802f-4a94-a69f-0be1648ae09b 795 0 2026-01-16 21:17:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6787d9fcf6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} 
{k8s ci-4580-0-0-p-be73a47b79 calico-apiserver-6787d9fcf6-9lshv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5da0d7c6c6d [] [] }} ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-9lshv" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-" Jan 16 21:17:41.859627 containerd[1678]: 2026-01-16 21:17:41.746 [INFO][4519] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-9lshv" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" Jan 16 21:17:41.859627 containerd[1678]: 2026-01-16 21:17:41.793 [INFO][4562] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" HandleID="k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.793 [INFO][4562] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" HandleID="k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f120), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-be73a47b79", "pod":"calico-apiserver-6787d9fcf6-9lshv", "timestamp":"2026-01-16 21:17:41.793255801 +0000 UTC"}, Hostname:"ci-4580-0-0-p-be73a47b79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.794 [INFO][4562] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.794 [INFO][4562] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.796 [INFO][4562] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-be73a47b79' Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.804 [INFO][4562] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.807 [INFO][4562] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.814 [INFO][4562] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.816 [INFO][4562] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860262 containerd[1678]: 2026-01-16 21:17:41.819 [INFO][4562] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860955 containerd[1678]: 2026-01-16 21:17:41.819 [INFO][4562] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860955 containerd[1678]: 2026-01-16 21:17:41.822 [INFO][4562] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83 Jan 16 21:17:41.860955 containerd[1678]: 2026-01-16 21:17:41.827 [INFO][4562] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860955 containerd[1678]: 2026-01-16 21:17:41.836 [INFO][4562] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.61.197/26] block=192.168.61.192/26 handle="k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860955 containerd[1678]: 2026-01-16 21:17:41.836 [INFO][4562] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.197/26] handle="k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.860955 containerd[1678]: 2026-01-16 21:17:41.836 [INFO][4562] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 16 21:17:41.860955 containerd[1678]: 2026-01-16 21:17:41.836 [INFO][4562] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.61.197/26] IPv6=[] ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" HandleID="k8s-pod-network.a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" Jan 16 21:17:41.861799 containerd[1678]: 2026-01-16 21:17:41.840 [INFO][4519] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-9lshv" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0", GenerateName:"calico-apiserver-6787d9fcf6-", Namespace:"calico-apiserver", SelfLink:"", UID:"ce4d12ad-802f-4a94-a69f-0be1648ae09b", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6787d9fcf6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"", Pod:"calico-apiserver-6787d9fcf6-9lshv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.61.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5da0d7c6c6d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:41.861865 containerd[1678]: 2026-01-16 21:17:41.840 [INFO][4519] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.197/32] ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-9lshv" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" Jan 16 21:17:41.861865 containerd[1678]: 2026-01-16 21:17:41.840 [INFO][4519] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5da0d7c6c6d ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-9lshv" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" Jan 16 21:17:41.861865 containerd[1678]: 2026-01-16 21:17:41.847 [INFO][4519] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-9lshv" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" Jan 16 21:17:41.861926 containerd[1678]: 2026-01-16 21:17:41.847 [INFO][4519] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-9lshv" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0", GenerateName:"calico-apiserver-6787d9fcf6-", Namespace:"calico-apiserver", SelfLink:"", UID:"ce4d12ad-802f-4a94-a69f-0be1648ae09b", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6787d9fcf6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83", Pod:"calico-apiserver-6787d9fcf6-9lshv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5da0d7c6c6d", MAC:"9a:98:07:87:1a:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:41.862003 containerd[1678]: 2026-01-16 21:17:41.857 [INFO][4519] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-9lshv" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--9lshv-eth0" Jan 16 21:17:41.871000 audit[4589]: NETFILTER_CFG table=filter:130 family=2 
entries=62 op=nft_register_chain pid=4589 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:41.873201 kernel: kauditd_printk_skb: 128 callbacks suppressed Jan 16 21:17:41.873244 kernel: audit: type=1325 audit(1768598261.871:686): table=filter:130 family=2 entries=62 op=nft_register_chain pid=4589 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:41.871000 audit[4589]: SYSCALL arch=c000003e syscall=46 success=yes exit=31772 a0=3 a1=7ffd329693d0 a2=0 a3=7ffd329693bc items=0 ppid=4045 pid=4589 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.877111 kernel: audit: type=1300 audit(1768598261.871:686): arch=c000003e syscall=46 success=yes exit=31772 a0=3 a1=7ffd329693d0 a2=0 a3=7ffd329693bc items=0 ppid=4045 pid=4589 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.871000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:41.880917 kernel: audit: type=1327 audit(1768598261.871:686): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:41.914085 containerd[1678]: time="2026-01-16T21:17:41.913832184Z" level=info msg="connecting to shim a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83" address="unix:///run/containerd/s/3a4a6436dace03675736d183dfcb87a9df716c262abf32279ccbec55bc400cff" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:41.945834 systemd[1]: Started 
cri-containerd-a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83.scope - libcontainer container a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83. Jan 16 21:17:41.954987 systemd-networkd[1571]: calicaba625dc96: Link UP Jan 16 21:17:41.955185 systemd-networkd[1571]: calicaba625dc96: Gained carrier Jan 16 21:17:41.975776 containerd[1678]: 2026-01-16 21:17:41.752 [INFO][4532] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0 calico-apiserver-6787d9fcf6- calico-apiserver eb6b7565-0bac-4e11-9918-28c46c9c8c58 793 0 2026-01-16 21:17:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6787d9fcf6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4580-0-0-p-be73a47b79 calico-apiserver-6787d9fcf6-mcbl2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calicaba625dc96 [] [] }} ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-mcbl2" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-" Jan 16 21:17:41.975776 containerd[1678]: 2026-01-16 21:17:41.752 [INFO][4532] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-mcbl2" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" Jan 16 21:17:41.975776 containerd[1678]: 2026-01-16 21:17:41.797 [INFO][4564] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" 
HandleID="k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.797 [INFO][4564] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" HandleID="k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-be73a47b79", "pod":"calico-apiserver-6787d9fcf6-mcbl2", "timestamp":"2026-01-16 21:17:41.797526001 +0000 UTC"}, Hostname:"ci-4580-0-0-p-be73a47b79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.797 [INFO][4564] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.836 [INFO][4564] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.836 [INFO][4564] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-be73a47b79' Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.905 [INFO][4564] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.912 [INFO][4564] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.917 [INFO][4564] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.920 [INFO][4564] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.975971 containerd[1678]: 2026-01-16 21:17:41.925 [INFO][4564] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.976161 containerd[1678]: 2026-01-16 21:17:41.926 [INFO][4564] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.976161 containerd[1678]: 2026-01-16 21:17:41.927 [INFO][4564] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde Jan 16 21:17:41.976161 containerd[1678]: 2026-01-16 21:17:41.935 [INFO][4564] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.976161 containerd[1678]: 2026-01-16 21:17:41.945 [INFO][4564] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.61.198/26] block=192.168.61.192/26 handle="k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.976161 containerd[1678]: 2026-01-16 21:17:41.945 [INFO][4564] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.198/26] handle="k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:41.976161 containerd[1678]: 2026-01-16 21:17:41.945 [INFO][4564] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 21:17:41.976161 containerd[1678]: 2026-01-16 21:17:41.945 [INFO][4564] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.61.198/26] IPv6=[] ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" HandleID="k8s-pod-network.aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Workload="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" Jan 16 21:17:41.976306 containerd[1678]: 2026-01-16 21:17:41.948 [INFO][4532] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-mcbl2" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0", GenerateName:"calico-apiserver-6787d9fcf6-", Namespace:"calico-apiserver", SelfLink:"", UID:"eb6b7565-0bac-4e11-9918-28c46c9c8c58", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6787d9fcf6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"", Pod:"calico-apiserver-6787d9fcf6-mcbl2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicaba625dc96", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:41.976358 containerd[1678]: 2026-01-16 21:17:41.948 [INFO][4532] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.198/32] ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-mcbl2" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" Jan 16 21:17:41.976358 containerd[1678]: 2026-01-16 21:17:41.948 [INFO][4532] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicaba625dc96 ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-mcbl2" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" Jan 16 21:17:41.976358 containerd[1678]: 2026-01-16 21:17:41.956 [INFO][4532] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Namespace="calico-apiserver" 
Pod="calico-apiserver-6787d9fcf6-mcbl2" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" Jan 16 21:17:41.976418 containerd[1678]: 2026-01-16 21:17:41.958 [INFO][4532] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-mcbl2" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0", GenerateName:"calico-apiserver-6787d9fcf6-", Namespace:"calico-apiserver", SelfLink:"", UID:"eb6b7565-0bac-4e11-9918-28c46c9c8c58", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6787d9fcf6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde", Pod:"calico-apiserver-6787d9fcf6-mcbl2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calicaba625dc96", MAC:"d6:d3:09:79:6c:4d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:41.976467 containerd[1678]: 2026-01-16 21:17:41.973 [INFO][4532] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" Namespace="calico-apiserver" Pod="calico-apiserver-6787d9fcf6-mcbl2" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-calico--apiserver--6787d9fcf6--mcbl2-eth0" Jan 16 21:17:41.988000 audit: BPF prog-id=226 op=LOAD Jan 16 21:17:41.990662 kernel: audit: type=1334 audit(1768598261.988:687): prog-id=226 op=LOAD Jan 16 21:17:41.990000 audit: BPF prog-id=227 op=LOAD Jan 16 21:17:41.992612 kernel: audit: type=1334 audit(1768598261.990:688): prog-id=227 op=LOAD Jan 16 21:17:41.993401 kernel: audit: type=1300 audit(1768598261.990:688): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.990000 audit[4611]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:42.002610 kernel: audit: type=1327 audit(1768598261.990:688): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:42.002661 kernel: audit: type=1334 audit(1768598261.990:689): prog-id=227 op=UNLOAD Jan 16 21:17:41.990000 audit: BPF prog-id=227 op=UNLOAD Jan 16 21:17:41.990000 audit[4611]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.004930 kernel: audit: type=1300 audit(1768598261.990:689): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.013263 systemd-networkd[1571]: calif10188df8cb: Gained IPv6LL Jan 16 21:17:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:41.990000 audit: BPF prog-id=228 op=LOAD Jan 16 21:17:41.990000 audit[4611]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.990000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:41.990000 audit: BPF prog-id=229 op=LOAD Jan 16 21:17:41.990000 audit[4611]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:41.990000 audit: BPF prog-id=229 op=UNLOAD Jan 16 21:17:42.018164 kernel: audit: type=1327 audit(1768598261.990:689): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:41.990000 audit[4611]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:41.990000 audit: 
BPF prog-id=228 op=UNLOAD Jan 16 21:17:41.990000 audit[4611]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:41.990000 audit: BPF prog-id=230 op=LOAD Jan 16 21:17:41.990000 audit[4611]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4599 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130333063623463326262303333313764646233363162616462613362 Jan 16 21:17:41.992000 audit[4638]: NETFILTER_CFG table=filter:131 family=2 entries=53 op=nft_register_chain pid=4638 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:41.992000 audit[4638]: SYSCALL arch=c000003e syscall=46 success=yes exit=26640 a0=3 a1=7ffef1578f10 a2=0 a3=7ffef1578efc items=0 ppid=4045 pid=4638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:41.992000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:42.029758 containerd[1678]: time="2026-01-16T21:17:42.029718997Z" level=info msg="connecting to shim aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde" address="unix:///run/containerd/s/ae9ca68ce28bd830267b45b2a0ed31d9ee3d752de2d89d16ff18203af15384a6" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:42.078943 systemd[1]: Started cri-containerd-aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde.scope - libcontainer container aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde. Jan 16 21:17:42.086040 systemd-networkd[1571]: calib61b1616a02: Link UP Jan 16 21:17:42.090011 containerd[1678]: time="2026-01-16T21:17:42.088937056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787d9fcf6-9lshv,Uid:ce4d12ad-802f-4a94-a69f-0be1648ae09b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a030cb4c2bb03317ddb361badba3bf63a73614feb4a712245945124d601fea83\"" Jan 16 21:17:42.091902 systemd-networkd[1571]: calib61b1616a02: Gained carrier Jan 16 21:17:42.098129 containerd[1678]: time="2026-01-16T21:17:42.095140378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:17:42.108000 audit: BPF prog-id=231 op=LOAD Jan 16 21:17:42.109000 audit: BPF prog-id=232 op=LOAD Jan 16 21:17:42.109000 audit[4659]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=4648 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.109000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161343039326234393962646666336661363337376436363764633734 Jan 16 21:17:42.110000 audit: BPF prog-id=232 op=UNLOAD Jan 16 21:17:42.110000 audit[4659]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4648 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161343039326234393962646666336661363337376436363764633734 Jan 16 21:17:42.110000 audit: BPF prog-id=233 op=LOAD Jan 16 21:17:42.110000 audit[4659]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=4648 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161343039326234393962646666336661363337376436363764633734 Jan 16 21:17:42.110000 audit: BPF prog-id=234 op=LOAD Jan 16 21:17:42.110000 audit[4659]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=4648 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 21:17:42.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161343039326234393962646666336661363337376436363764633734 Jan 16 21:17:42.110000 audit: BPF prog-id=234 op=UNLOAD Jan 16 21:17:42.110000 audit[4659]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4648 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161343039326234393962646666336661363337376436363764633734 Jan 16 21:17:42.110000 audit: BPF prog-id=233 op=UNLOAD Jan 16 21:17:42.110000 audit[4659]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4648 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161343039326234393962646666336661363337376436363764633734 Jan 16 21:17:42.110000 audit: BPF prog-id=235 op=LOAD Jan 16 21:17:42.110000 audit[4659]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=4648 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161343039326234393962646666336661363337376436363764633734 Jan 16 21:17:42.113054 containerd[1678]: 2026-01-16 21:17:41.769 [INFO][4541] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0 coredns-668d6bf9bc- kube-system 1133c715-3a00-415e-9905-d5dd19553c44 789 0 2026-01-16 21:17:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4580-0-0-p-be73a47b79 coredns-668d6bf9bc-mg6kr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib61b1616a02 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Namespace="kube-system" Pod="coredns-668d6bf9bc-mg6kr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-" Jan 16 21:17:42.113054 containerd[1678]: 2026-01-16 21:17:41.769 [INFO][4541] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Namespace="kube-system" Pod="coredns-668d6bf9bc-mg6kr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" Jan 16 21:17:42.113054 containerd[1678]: 2026-01-16 21:17:41.800 [INFO][4573] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" HandleID="k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" 
Workload="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:41.800 [INFO][4573] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" HandleID="k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Workload="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d57c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4580-0-0-p-be73a47b79", "pod":"coredns-668d6bf9bc-mg6kr", "timestamp":"2026-01-16 21:17:41.800664054 +0000 UTC"}, Hostname:"ci-4580-0-0-p-be73a47b79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:41.800 [INFO][4573] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:41.945 [INFO][4573] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:41.945 [INFO][4573] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-be73a47b79' Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:42.011 [INFO][4573] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:42.036 [INFO][4573] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:42.042 [INFO][4573] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:42.046 [INFO][4573] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113216 containerd[1678]: 2026-01-16 21:17:42.049 [INFO][4573] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113398 containerd[1678]: 2026-01-16 21:17:42.050 [INFO][4573] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113398 containerd[1678]: 2026-01-16 21:17:42.055 [INFO][4573] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227 Jan 16 21:17:42.113398 containerd[1678]: 2026-01-16 21:17:42.061 [INFO][4573] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113398 containerd[1678]: 2026-01-16 21:17:42.072 [INFO][4573] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.61.199/26] block=192.168.61.192/26 handle="k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113398 containerd[1678]: 2026-01-16 21:17:42.074 [INFO][4573] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.199/26] handle="k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.113398 containerd[1678]: 2026-01-16 21:17:42.074 [INFO][4573] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 21:17:42.113398 containerd[1678]: 2026-01-16 21:17:42.074 [INFO][4573] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.61.199/26] IPv6=[] ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" HandleID="k8s-pod-network.e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Workload="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" Jan 16 21:17:42.113529 containerd[1678]: 2026-01-16 21:17:42.076 [INFO][4541] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Namespace="kube-system" Pod="coredns-668d6bf9bc-mg6kr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"1133c715-3a00-415e-9905-d5dd19553c44", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"", Pod:"coredns-668d6bf9bc-mg6kr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib61b1616a02", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:42.113529 containerd[1678]: 2026-01-16 21:17:42.076 [INFO][4541] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.199/32] ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Namespace="kube-system" Pod="coredns-668d6bf9bc-mg6kr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" Jan 16 21:17:42.113529 containerd[1678]: 2026-01-16 21:17:42.076 [INFO][4541] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib61b1616a02 ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Namespace="kube-system" Pod="coredns-668d6bf9bc-mg6kr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" Jan 16 21:17:42.113529 containerd[1678]: 2026-01-16 21:17:42.095 [INFO][4541] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Namespace="kube-system" Pod="coredns-668d6bf9bc-mg6kr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" Jan 16 21:17:42.113529 containerd[1678]: 2026-01-16 21:17:42.095 [INFO][4541] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Namespace="kube-system" Pod="coredns-668d6bf9bc-mg6kr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"1133c715-3a00-415e-9905-d5dd19553c44", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227", Pod:"coredns-668d6bf9bc-mg6kr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib61b1616a02", 
MAC:"92:0e:84:ac:03:1d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:42.113529 containerd[1678]: 2026-01-16 21:17:42.109 [INFO][4541] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" Namespace="kube-system" Pod="coredns-668d6bf9bc-mg6kr" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--mg6kr-eth0" Jan 16 21:17:42.152749 containerd[1678]: time="2026-01-16T21:17:42.152669517Z" level=info msg="connecting to shim e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227" address="unix:///run/containerd/s/3a4dcb8f05f502246068372a6f28fa1dede7b3d288ada0f0bd97fe0e1560c03c" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:42.158000 audit[4703]: NETFILTER_CFG table=filter:132 family=2 entries=68 op=nft_register_chain pid=4703 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:42.158000 audit[4703]: SYSCALL arch=c000003e syscall=46 success=yes exit=31344 a0=3 a1=7fffa0f29620 a2=0 a3=7fffa0f2960c items=0 ppid=4045 pid=4703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.158000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:42.178907 
systemd[1]: Started cri-containerd-e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227.scope - libcontainer container e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227. Jan 16 21:17:42.194000 audit: BPF prog-id=236 op=LOAD Jan 16 21:17:42.195000 audit: BPF prog-id=237 op=LOAD Jan 16 21:17:42.195000 audit[4714]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=4701 pid=4714 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353030323266343638313439666636633637623734613932343731 Jan 16 21:17:42.195000 audit: BPF prog-id=237 op=UNLOAD Jan 16 21:17:42.195000 audit[4714]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4701 pid=4714 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353030323266343638313439666636633637623734613932343731 Jan 16 21:17:42.195000 audit: BPF prog-id=238 op=LOAD Jan 16 21:17:42.195000 audit[4714]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=4701 pid=4714 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:42.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353030323266343638313439666636633637623734613932343731 Jan 16 21:17:42.195000 audit: BPF prog-id=239 op=LOAD Jan 16 21:17:42.195000 audit[4714]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=4701 pid=4714 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353030323266343638313439666636633637623734613932343731 Jan 16 21:17:42.195000 audit: BPF prog-id=239 op=UNLOAD Jan 16 21:17:42.195000 audit[4714]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4701 pid=4714 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353030323266343638313439666636633637623734613932343731 Jan 16 21:17:42.195000 audit: BPF prog-id=238 op=UNLOAD Jan 16 21:17:42.195000 audit[4714]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4701 pid=4714 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353030323266343638313439666636633637623734613932343731 Jan 16 21:17:42.197000 audit: BPF prog-id=240 op=LOAD Jan 16 21:17:42.197000 audit[4714]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=4701 pid=4714 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.197000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353030323266343638313439666636633637623734613932343731 Jan 16 21:17:42.203325 containerd[1678]: time="2026-01-16T21:17:42.203294816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787d9fcf6-mcbl2,Uid:eb6b7565-0bac-4e11-9918-28c46c9c8c58,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"aa4092b499bdff3fa6377d667dc74891b37f6ce328eb1f0e625fce63a1d51dde\"" Jan 16 21:17:42.237262 containerd[1678]: time="2026-01-16T21:17:42.237216180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mg6kr,Uid:1133c715-3a00-415e-9905-d5dd19553c44,Namespace:kube-system,Attempt:0,} returns sandbox id \"e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227\"" Jan 16 21:17:42.240858 containerd[1678]: time="2026-01-16T21:17:42.240821794Z" level=info msg="CreateContainer within sandbox \"e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 16 21:17:42.267132 containerd[1678]: 
time="2026-01-16T21:17:42.266882474Z" level=info msg="Container 006d154c990265b1911d54a8fccfe17854cb21ced5f391a310a00ca731f1a7b0: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:17:42.280016 containerd[1678]: time="2026-01-16T21:17:42.279984545Z" level=info msg="CreateContainer within sandbox \"e750022f468149ff6c67b74a92471b9c89f5ed5485da90ee989792bf9a123227\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"006d154c990265b1911d54a8fccfe17854cb21ced5f391a310a00ca731f1a7b0\"" Jan 16 21:17:42.280967 containerd[1678]: time="2026-01-16T21:17:42.280818706Z" level=info msg="StartContainer for \"006d154c990265b1911d54a8fccfe17854cb21ced5f391a310a00ca731f1a7b0\"" Jan 16 21:17:42.282638 containerd[1678]: time="2026-01-16T21:17:42.282512180Z" level=info msg="connecting to shim 006d154c990265b1911d54a8fccfe17854cb21ced5f391a310a00ca731f1a7b0" address="unix:///run/containerd/s/3a4dcb8f05f502246068372a6f28fa1dede7b3d288ada0f0bd97fe0e1560c03c" protocol=ttrpc version=3 Jan 16 21:17:42.304785 systemd[1]: Started cri-containerd-006d154c990265b1911d54a8fccfe17854cb21ced5f391a310a00ca731f1a7b0.scope - libcontainer container 006d154c990265b1911d54a8fccfe17854cb21ced5f391a310a00ca731f1a7b0. 
Jan 16 21:17:42.317000 audit: BPF prog-id=241 op=LOAD Jan 16 21:17:42.318000 audit: BPF prog-id=242 op=LOAD Jan 16 21:17:42.318000 audit[4747]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4701 pid=4747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.318000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030366431353463393930323635623139313164353461386663636665 Jan 16 21:17:42.318000 audit: BPF prog-id=242 op=UNLOAD Jan 16 21:17:42.318000 audit[4747]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4701 pid=4747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.318000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030366431353463393930323635623139313164353461386663636665 Jan 16 21:17:42.318000 audit: BPF prog-id=243 op=LOAD Jan 16 21:17:42.318000 audit[4747]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4701 pid=4747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.318000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030366431353463393930323635623139313164353461386663636665 Jan 16 21:17:42.319000 audit: BPF prog-id=244 op=LOAD Jan 16 21:17:42.319000 audit[4747]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4701 pid=4747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.319000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030366431353463393930323635623139313164353461386663636665 Jan 16 21:17:42.319000 audit: BPF prog-id=244 op=UNLOAD Jan 16 21:17:42.319000 audit[4747]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4701 pid=4747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.319000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030366431353463393930323635623139313164353461386663636665 Jan 16 21:17:42.319000 audit: BPF prog-id=243 op=UNLOAD Jan 16 21:17:42.319000 audit[4747]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4701 pid=4747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:42.319000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030366431353463393930323635623139313164353461386663636665 Jan 16 21:17:42.319000 audit: BPF prog-id=245 op=LOAD Jan 16 21:17:42.319000 audit[4747]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4701 pid=4747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.319000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030366431353463393930323635623139313164353461386663636665 Jan 16 21:17:42.342508 containerd[1678]: time="2026-01-16T21:17:42.342446720Z" level=info msg="StartContainer for \"006d154c990265b1911d54a8fccfe17854cb21ced5f391a310a00ca731f1a7b0\" returns successfully" Jan 16 21:17:42.454091 containerd[1678]: time="2026-01-16T21:17:42.453882968Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:42.455617 containerd[1678]: time="2026-01-16T21:17:42.455524240Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:17:42.455705 containerd[1678]: time="2026-01-16T21:17:42.455686168Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:42.456057 kubelet[2882]: E0116 21:17:42.456002 2882 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:17:42.457711 kubelet[2882]: E0116 21:17:42.456065 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:17:42.457711 kubelet[2882]: E0116 21:17:42.456330 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdsvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-9lshv_calico-apiserver(ce4d12ad-802f-4a94-a69f-0be1648ae09b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:42.457886 containerd[1678]: time="2026-01-16T21:17:42.456571853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:17:42.457983 kubelet[2882]: E0116 21:17:42.457964 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:17:42.670353 containerd[1678]: time="2026-01-16T21:17:42.670314958Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-sflq7,Uid:88cd36cc-6fbb-4a49-8bfe-3b87f533604b,Namespace:kube-system,Attempt:0,}" Jan 16 21:17:42.768693 systemd-networkd[1571]: cali99b974c08e2: Link UP Jan 16 21:17:42.769594 systemd-networkd[1571]: cali99b974c08e2: Gained carrier Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.710 [INFO][4777] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0 coredns-668d6bf9bc- kube-system 88cd36cc-6fbb-4a49-8bfe-3b87f533604b 794 0 2026-01-16 21:17:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4580-0-0-p-be73a47b79 coredns-668d6bf9bc-sflq7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali99b974c08e2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sflq7" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.710 [INFO][4777] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sflq7" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.736 [INFO][4790] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" HandleID="k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Workload="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" Jan 16 21:17:42.787933 
containerd[1678]: 2026-01-16 21:17:42.736 [INFO][4790] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" HandleID="k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Workload="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5030), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4580-0-0-p-be73a47b79", "pod":"coredns-668d6bf9bc-sflq7", "timestamp":"2026-01-16 21:17:42.736132839 +0000 UTC"}, Hostname:"ci-4580-0-0-p-be73a47b79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.736 [INFO][4790] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.736 [INFO][4790] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.736 [INFO][4790] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-be73a47b79' Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.741 [INFO][4790] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.745 [INFO][4790] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.748 [INFO][4790] ipam/ipam.go 511: Trying affinity for 192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.749 [INFO][4790] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.751 [INFO][4790] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.192/26 host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.751 [INFO][4790] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.61.192/26 handle="k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.755 [INFO][4790] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7 Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.758 [INFO][4790] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.61.192/26 handle="k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.764 [INFO][4790] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.61.200/26] block=192.168.61.192/26 handle="k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.764 [INFO][4790] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.200/26] handle="k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" host="ci-4580-0-0-p-be73a47b79" Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.764 [INFO][4790] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 21:17:42.787933 containerd[1678]: 2026-01-16 21:17:42.764 [INFO][4790] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.61.200/26] IPv6=[] ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" HandleID="k8s-pod-network.d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Workload="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" Jan 16 21:17:42.788445 containerd[1678]: 2026-01-16 21:17:42.766 [INFO][4777] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sflq7" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"88cd36cc-6fbb-4a49-8bfe-3b87f533604b", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"", Pod:"coredns-668d6bf9bc-sflq7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali99b974c08e2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:42.788445 containerd[1678]: 2026-01-16 21:17:42.766 [INFO][4777] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.200/32] ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sflq7" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" Jan 16 21:17:42.788445 containerd[1678]: 2026-01-16 21:17:42.766 [INFO][4777] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali99b974c08e2 ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sflq7" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" Jan 16 21:17:42.788445 containerd[1678]: 2026-01-16 21:17:42.770 [INFO][4777] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sflq7" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" Jan 16 21:17:42.788445 containerd[1678]: 2026-01-16 21:17:42.770 [INFO][4777] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sflq7" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"88cd36cc-6fbb-4a49-8bfe-3b87f533604b", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 21, 17, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-be73a47b79", ContainerID:"d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7", Pod:"coredns-668d6bf9bc-sflq7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali99b974c08e2", 
MAC:"fe:63:c6:59:ff:75", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 21:17:42.788445 containerd[1678]: 2026-01-16 21:17:42.783 [INFO][4777] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sflq7" WorkloadEndpoint="ci--4580--0--0--p--be73a47b79-k8s-coredns--668d6bf9bc--sflq7-eth0" Jan 16 21:17:42.796862 containerd[1678]: time="2026-01-16T21:17:42.796826059Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:42.798203 containerd[1678]: time="2026-01-16T21:17:42.798091243Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:17:42.798203 containerd[1678]: time="2026-01-16T21:17:42.798157598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:42.798611 kubelet[2882]: E0116 21:17:42.798445 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:17:42.798611 
kubelet[2882]: E0116 21:17:42.798502 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:17:42.798759 kubelet[2882]: E0116 21:17:42.798721 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72sgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-mcbl2_calico-apiserver(eb6b7565-0bac-4e11-9918-28c46c9c8c58): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:42.800251 kubelet[2882]: E0116 21:17:42.800212 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:17:42.816391 kubelet[2882]: E0116 21:17:42.816352 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:17:42.819280 kubelet[2882]: E0116 21:17:42.819163 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:17:42.826356 containerd[1678]: time="2026-01-16T21:17:42.826321155Z" level=info msg="connecting to shim d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7" address="unix:///run/containerd/s/c91d480ee90aa020e68362d75d0db106f6b46fa6f0a67ff2772d19d62a34bce8" namespace=k8s.io protocol=ttrpc version=3 Jan 16 21:17:42.828026 kubelet[2882]: E0116 21:17:42.827972 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:42.845000 audit[4820]: NETFILTER_CFG table=filter:133 family=2 entries=52 op=nft_register_chain pid=4820 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 21:17:42.845000 audit[4820]: SYSCALL arch=c000003e syscall=46 success=yes exit=23892 a0=3 a1=7ffed1793b60 a2=0 a3=7ffed1793b4c items=0 ppid=4045 pid=4820 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.845000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 21:17:42.849876 kubelet[2882]: I0116 21:17:42.849832 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-mg6kr" podStartSLOduration=41.849816124 podStartE2EDuration="41.849816124s" podCreationTimestamp="2026-01-16 21:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 21:17:42.849635301 +0000 UTC m=+47.284574176" watchObservedRunningTime="2026-01-16 21:17:42.849816124 +0000 UTC m=+47.284755004" Jan 16 21:17:42.865100 systemd[1]: Started cri-containerd-d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7.scope - libcontainer container d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7. 
Jan 16 21:17:42.921000 audit: BPF prog-id=246 op=LOAD Jan 16 21:17:42.924000 audit: BPF prog-id=247 op=LOAD Jan 16 21:17:42.924000 audit[4825]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4812 pid=4825 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.924000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434313239383938636136366336643161643131366266663632643761 Jan 16 21:17:42.925000 audit: BPF prog-id=247 op=UNLOAD Jan 16 21:17:42.925000 audit[4825]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4812 pid=4825 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.925000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434313239383938636136366336643161643131366266663632643761 Jan 16 21:17:42.926000 audit: BPF prog-id=248 op=LOAD Jan 16 21:17:42.926000 audit[4825]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4812 pid=4825 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.926000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434313239383938636136366336643161643131366266663632643761 Jan 16 21:17:42.926000 audit: BPF prog-id=249 op=LOAD Jan 16 21:17:42.926000 audit[4825]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4812 pid=4825 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434313239383938636136366336643161643131366266663632643761 Jan 16 21:17:42.926000 audit: BPF prog-id=249 op=UNLOAD Jan 16 21:17:42.926000 audit[4825]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4812 pid=4825 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434313239383938636136366336643161643131366266663632643761 Jan 16 21:17:42.926000 audit: BPF prog-id=248 op=UNLOAD Jan 16 21:17:42.926000 audit[4825]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4812 pid=4825 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:42.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434313239383938636136366336643161643131366266663632643761 Jan 16 21:17:42.926000 audit: BPF prog-id=250 op=LOAD Jan 16 21:17:42.926000 audit[4825]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4812 pid=4825 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:42.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434313239383938636136366336643161643131366266663632643761 Jan 16 21:17:42.995237 containerd[1678]: time="2026-01-16T21:17:42.995191446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sflq7,Uid:88cd36cc-6fbb-4a49-8bfe-3b87f533604b,Namespace:kube-system,Attempt:0,} returns sandbox id \"d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7\"" Jan 16 21:17:42.999204 containerd[1678]: time="2026-01-16T21:17:42.999174062Z" level=info msg="CreateContainer within sandbox \"d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 16 21:17:43.007000 audit[4851]: NETFILTER_CFG table=filter:134 family=2 entries=20 op=nft_register_rule pid=4851 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:43.007000 audit[4851]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc323ede80 a2=0 a3=7ffc323ede6c items=0 ppid=3028 pid=4851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.007000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:43.011000 audit[4851]: NETFILTER_CFG table=nat:135 family=2 entries=14 op=nft_register_rule pid=4851 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:43.011000 audit[4851]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffc323ede80 a2=0 a3=0 items=0 ppid=3028 pid=4851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.011000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:43.027960 containerd[1678]: time="2026-01-16T21:17:43.027794214Z" level=info msg="Container 6caff057cc1e5fb046642c68d4920dfbb84fe321dd4b982c583062b10d65cfcd: CDI devices from CRI Config.CDIDevices: []" Jan 16 21:17:43.034000 audit[4853]: NETFILTER_CFG table=filter:136 family=2 entries=17 op=nft_register_rule pid=4853 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:43.034000 audit[4853]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd69c79d00 a2=0 a3=7ffd69c79cec items=0 ppid=3028 pid=4853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.034000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:43.040000 audit[4853]: NETFILTER_CFG table=nat:137 family=2 entries=35 op=nft_register_chain pid=4853 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 16 21:17:43.040000 audit[4853]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffd69c79d00 a2=0 a3=7ffd69c79cec items=0 ppid=3028 pid=4853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.040000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:43.051866 containerd[1678]: time="2026-01-16T21:17:43.051823081Z" level=info msg="CreateContainer within sandbox \"d4129898ca66c6d1ad116bff62d7a285defda16bd11d9a2780eafdf6273263d7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6caff057cc1e5fb046642c68d4920dfbb84fe321dd4b982c583062b10d65cfcd\"" Jan 16 21:17:43.052585 containerd[1678]: time="2026-01-16T21:17:43.052553707Z" level=info msg="StartContainer for \"6caff057cc1e5fb046642c68d4920dfbb84fe321dd4b982c583062b10d65cfcd\"" Jan 16 21:17:43.053571 containerd[1678]: time="2026-01-16T21:17:43.053538961Z" level=info msg="connecting to shim 6caff057cc1e5fb046642c68d4920dfbb84fe321dd4b982c583062b10d65cfcd" address="unix:///run/containerd/s/c91d480ee90aa020e68362d75d0db106f6b46fa6f0a67ff2772d19d62a34bce8" protocol=ttrpc version=3 Jan 16 21:17:43.079796 systemd[1]: Started cri-containerd-6caff057cc1e5fb046642c68d4920dfbb84fe321dd4b982c583062b10d65cfcd.scope - libcontainer container 6caff057cc1e5fb046642c68d4920dfbb84fe321dd4b982c583062b10d65cfcd. 
Jan 16 21:17:43.091000 audit: BPF prog-id=251 op=LOAD Jan 16 21:17:43.091000 audit: BPF prog-id=252 op=LOAD Jan 16 21:17:43.091000 audit[4854]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4812 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.091000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663616666303537636331653566623034363634326336386434393230 Jan 16 21:17:43.091000 audit: BPF prog-id=252 op=UNLOAD Jan 16 21:17:43.091000 audit[4854]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4812 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.091000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663616666303537636331653566623034363634326336386434393230 Jan 16 21:17:43.092000 audit: BPF prog-id=253 op=LOAD Jan 16 21:17:43.092000 audit[4854]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4812 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.092000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663616666303537636331653566623034363634326336386434393230 Jan 16 21:17:43.092000 audit: BPF prog-id=254 op=LOAD Jan 16 21:17:43.092000 audit[4854]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4812 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663616666303537636331653566623034363634326336386434393230 Jan 16 21:17:43.092000 audit: BPF prog-id=254 op=UNLOAD Jan 16 21:17:43.092000 audit[4854]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4812 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663616666303537636331653566623034363634326336386434393230 Jan 16 21:17:43.092000 audit: BPF prog-id=253 op=UNLOAD Jan 16 21:17:43.092000 audit[4854]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4812 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
21:17:43.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663616666303537636331653566623034363634326336386434393230 Jan 16 21:17:43.092000 audit: BPF prog-id=255 op=LOAD Jan 16 21:17:43.092000 audit[4854]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4812 pid=4854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663616666303537636331653566623034363634326336386434393230 Jan 16 21:17:43.118212 containerd[1678]: time="2026-01-16T21:17:43.118171099Z" level=info msg="StartContainer for \"6caff057cc1e5fb046642c68d4920dfbb84fe321dd4b982c583062b10d65cfcd\" returns successfully" Jan 16 21:17:43.228337 systemd-networkd[1571]: calicaba625dc96: Gained IPv6LL Jan 16 21:17:43.611867 systemd-networkd[1571]: cali5da0d7c6c6d: Gained IPv6LL Jan 16 21:17:43.828806 kubelet[2882]: E0116 21:17:43.828449 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:17:43.829199 kubelet[2882]: E0116 21:17:43.829164 2882 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:17:43.861684 kubelet[2882]: I0116 21:17:43.861636 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-sflq7" podStartSLOduration=42.861620174 podStartE2EDuration="42.861620174s" podCreationTimestamp="2026-01-16 21:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 21:17:43.86144587 +0000 UTC m=+48.296384749" watchObservedRunningTime="2026-01-16 21:17:43.861620174 +0000 UTC m=+48.296559058" Jan 16 21:17:43.883000 audit[4889]: NETFILTER_CFG table=filter:138 family=2 entries=14 op=nft_register_rule pid=4889 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:43.883000 audit[4889]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcc7ec4c10 a2=0 a3=7ffcc7ec4bfc items=0 ppid=3028 pid=4889 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.883000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:43.891000 audit[4889]: NETFILTER_CFG table=nat:139 family=2 entries=44 op=nft_register_rule pid=4889 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:43.891000 audit[4889]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffcc7ec4c10 a2=0 a3=7ffcc7ec4bfc items=0 ppid=3028 pid=4889 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:43.891000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:44.059729 systemd-networkd[1571]: calib61b1616a02: Gained IPv6LL Jan 16 21:17:44.764006 systemd-networkd[1571]: cali99b974c08e2: Gained IPv6LL Jan 16 21:17:44.909000 audit[4891]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=4891 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:44.909000 audit[4891]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff7c65e760 a2=0 a3=7fff7c65e74c items=0 ppid=3028 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:44.909000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:44.935000 audit[4891]: NETFILTER_CFG table=nat:141 family=2 entries=56 op=nft_register_chain pid=4891 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:17:44.935000 audit[4891]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7fff7c65e760 a2=0 a3=7fff7c65e74c items=0 ppid=3028 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:17:44.935000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:17:49.674005 containerd[1678]: time="2026-01-16T21:17:49.673885304Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 21:17:50.011581 containerd[1678]: time="2026-01-16T21:17:50.011416403Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:50.013330 containerd[1678]: time="2026-01-16T21:17:50.013282231Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 21:17:50.013524 containerd[1678]: time="2026-01-16T21:17:50.013370465Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:50.013557 kubelet[2882]: E0116 21:17:50.013495 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:17:50.014005 kubelet[2882]: E0116 21:17:50.013560 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:17:50.014005 kubelet[2882]: E0116 21:17:50.013693 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aba7cb893c2466794c6629c238ba8cf,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:50.017512 containerd[1678]: time="2026-01-16T21:17:50.017471311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 21:17:50.364036 containerd[1678]: 
time="2026-01-16T21:17:50.363780157Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:50.365608 containerd[1678]: time="2026-01-16T21:17:50.365467069Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 21:17:50.365608 containerd[1678]: time="2026-01-16T21:17:50.365504223Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:50.365879 kubelet[2882]: E0116 21:17:50.365818 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:17:50.365928 kubelet[2882]: E0116 21:17:50.365877 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:17:50.366029 kubelet[2882]: E0116 21:17:50.365996 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:50.367281 kubelet[2882]: E0116 21:17:50.367215 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:17:53.670825 containerd[1678]: time="2026-01-16T21:17:53.670563723Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 21:17:54.010975 containerd[1678]: time="2026-01-16T21:17:54.010848012Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:54.016652 containerd[1678]: time="2026-01-16T21:17:54.016606084Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 21:17:54.016811 containerd[1678]: time="2026-01-16T21:17:54.016625916Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:54.016864 kubelet[2882]: E0116 21:17:54.016811 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:17:54.017257 kubelet[2882]: E0116 21:17:54.016863 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:17:54.017257 kubelet[2882]: E0116 21:17:54.017019 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twtjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPa
thExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-ns7rw_calico-system(3048fb4b-f634-4014-ae59-9ad3946acc61): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:54.018194 kubelet[2882]: E0116 21:17:54.018170 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 
16 21:17:54.671581 containerd[1678]: time="2026-01-16T21:17:54.671371570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:17:55.016976 containerd[1678]: time="2026-01-16T21:17:55.016730698Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:55.018515 containerd[1678]: time="2026-01-16T21:17:55.018404386Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:17:55.018515 containerd[1678]: time="2026-01-16T21:17:55.018419719Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:55.018679 kubelet[2882]: E0116 21:17:55.018638 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:17:55.018951 kubelet[2882]: E0116 21:17:55.018684 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:17:55.018951 kubelet[2882]: E0116 21:17:55.018784 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdsvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-9lshv_calico-apiserver(ce4d12ad-802f-4a94-a69f-0be1648ae09b): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:55.021387 kubelet[2882]: E0116 21:17:55.020867 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:17:55.671764 containerd[1678]: time="2026-01-16T21:17:55.671434943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 21:17:56.001525 containerd[1678]: time="2026-01-16T21:17:56.001390152Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:56.003407 containerd[1678]: time="2026-01-16T21:17:56.003337597Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 21:17:56.003523 containerd[1678]: time="2026-01-16T21:17:56.003433690Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:56.003710 kubelet[2882]: E0116 21:17:56.003677 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:17:56.003813 kubelet[2882]: E0116 21:17:56.003718 2882 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:17:56.003934 kubelet[2882]: E0116 21:17:56.003831 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btq9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56f896845d-kkjxr_calico-system(91564960-b99d-4a80-9373-1c0eb8e68a7f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:56.005056 kubelet[2882]: E0116 21:17:56.004992 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:17:57.670657 
containerd[1678]: time="2026-01-16T21:17:57.670389129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 21:17:58.035266 containerd[1678]: time="2026-01-16T21:17:58.035152663Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:58.037800 containerd[1678]: time="2026-01-16T21:17:58.037702381Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 21:17:58.037800 containerd[1678]: time="2026-01-16T21:17:58.037755763Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:58.037936 kubelet[2882]: E0116 21:17:58.037905 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:17:58.038180 kubelet[2882]: E0116 21:17:58.037946 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:17:58.038180 kubelet[2882]: E0116 21:17:58.038049 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 16 21:17:58.040867 containerd[1678]: time="2026-01-16T21:17:58.040775106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 21:17:58.382857 containerd[1678]: time="2026-01-16T21:17:58.382649132Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:58.385017 containerd[1678]: time="2026-01-16T21:17:58.384908535Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 21:17:58.385017 containerd[1678]: time="2026-01-16T21:17:58.384990153Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:58.385174 kubelet[2882]: E0116 21:17:58.385120 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:17:58.385215 kubelet[2882]: E0116 21:17:58.385184 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:17:58.385659 kubelet[2882]: E0116 21:17:58.385312 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:58.386454 kubelet[2882]: E0116 21:17:58.386431 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:17:58.670874 containerd[1678]: time="2026-01-16T21:17:58.670771488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:17:59.010354 containerd[1678]: time="2026-01-16T21:17:59.010082066Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:17:59.012370 containerd[1678]: time="2026-01-16T21:17:59.012267732Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:17:59.012370 containerd[1678]: time="2026-01-16T21:17:59.012317888Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:17:59.012548 kubelet[2882]: E0116 21:17:59.012458 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:17:59.012548 kubelet[2882]: E0116 21:17:59.012496 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:17:59.012957 kubelet[2882]: E0116 21:17:59.012641 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72sgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-mcbl2_calico-apiserver(eb6b7565-0bac-4e11-9918-28c46c9c8c58): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:17:59.014217 kubelet[2882]: E0116 21:17:59.014175 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:18:01.671258 kubelet[2882]: E0116 21:18:01.671212 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:18:06.670692 kubelet[2882]: E0116 21:18:06.670075 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:18:06.670692 kubelet[2882]: E0116 21:18:06.670133 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:18:09.670730 kubelet[2882]: E0116 21:18:09.670695 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:18:12.670611 kubelet[2882]: E0116 21:18:12.670531 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:18:13.671878 kubelet[2882]: E0116 21:18:13.671754 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:18:15.671838 containerd[1678]: 
time="2026-01-16T21:18:15.671715141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 21:18:16.020944 containerd[1678]: time="2026-01-16T21:18:16.020798165Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:18:16.023622 containerd[1678]: time="2026-01-16T21:18:16.023047278Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 21:18:16.023622 containerd[1678]: time="2026-01-16T21:18:16.023110171Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 21:18:16.023754 kubelet[2882]: E0116 21:18:16.023518 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:18:16.023754 kubelet[2882]: E0116 21:18:16.023571 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:18:16.023754 kubelet[2882]: E0116 21:18:16.023709 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aba7cb893c2466794c6629c238ba8cf,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 21:18:16.025759 containerd[1678]: time="2026-01-16T21:18:16.025733466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 21:18:16.355678 containerd[1678]: 
time="2026-01-16T21:18:16.354730625Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:18:16.356710 containerd[1678]: time="2026-01-16T21:18:16.356625089Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 21:18:16.356778 containerd[1678]: time="2026-01-16T21:18:16.356696192Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 21:18:16.356857 kubelet[2882]: E0116 21:18:16.356826 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:18:16.356901 kubelet[2882]: E0116 21:18:16.356869 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:18:16.357523 kubelet[2882]: E0116 21:18:16.357390 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 21:18:16.358680 kubelet[2882]: E0116 21:18:16.358636 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:18:19.678820 containerd[1678]: time="2026-01-16T21:18:19.678022800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:18:20.016415 containerd[1678]: time="2026-01-16T21:18:20.016310127Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:18:20.017846 containerd[1678]: time="2026-01-16T21:18:20.017797240Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:18:20.017921 containerd[1678]: time="2026-01-16T21:18:20.017872304Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:18:20.018116 kubelet[2882]: E0116 21:18:20.018033 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:18:20.018116 kubelet[2882]: E0116 21:18:20.018091 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:18:20.018526 kubelet[2882]: E0116 21:18:20.018199 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdsvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-9lshv_calico-apiserver(ce4d12ad-802f-4a94-a69f-0be1648ae09b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:18:20.019653 kubelet[2882]: E0116 21:18:20.019623 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:18:21.671740 containerd[1678]: time="2026-01-16T21:18:21.671080744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 21:18:22.001835 containerd[1678]: time="2026-01-16T21:18:22.001726558Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
21:18:22.003418 containerd[1678]: time="2026-01-16T21:18:22.003372002Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 21:18:22.003513 containerd[1678]: time="2026-01-16T21:18:22.003449846Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 21:18:22.003607 kubelet[2882]: E0116 21:18:22.003572 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:18:22.003878 kubelet[2882]: E0116 21:18:22.003627 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:18:22.003878 kubelet[2882]: E0116 21:18:22.003746 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twtjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-ns7rw_calico-system(3048fb4b-f634-4014-ae59-9ad3946acc61): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 21:18:22.004951 kubelet[2882]: E0116 21:18:22.004915 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:18:22.670814 containerd[1678]: time="2026-01-16T21:18:22.670749453Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 21:18:23.009078 containerd[1678]: time="2026-01-16T21:18:23.008845229Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:18:23.010522 containerd[1678]: 
time="2026-01-16T21:18:23.010412900Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 21:18:23.010522 containerd[1678]: time="2026-01-16T21:18:23.010498371Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 21:18:23.010994 kubelet[2882]: E0116 21:18:23.010757 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:18:23.010994 kubelet[2882]: E0116 21:18:23.010809 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:18:23.010994 kubelet[2882]: E0116 21:18:23.010947 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btq9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56f896845d-kkjxr_calico-system(91564960-b99d-4a80-9373-1c0eb8e68a7f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 21:18:23.012439 kubelet[2882]: E0116 21:18:23.012408 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:18:27.674658 containerd[1678]: time="2026-01-16T21:18:27.673806247Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 21:18:28.006140 containerd[1678]: time="2026-01-16T21:18:28.005989692Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
21:18:28.007767 containerd[1678]: time="2026-01-16T21:18:28.007700164Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 21:18:28.008059 containerd[1678]: time="2026-01-16T21:18:28.007797764Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 21:18:28.008408 kubelet[2882]: E0116 21:18:28.008274 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:18:28.009020 kubelet[2882]: E0116 21:18:28.008321 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:18:28.009020 kubelet[2882]: E0116 21:18:28.008889 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 16 21:18:28.011592 containerd[1678]: time="2026-01-16T21:18:28.011570872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 21:18:28.339664 containerd[1678]: time="2026-01-16T21:18:28.339410044Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:18:28.341106 containerd[1678]: time="2026-01-16T21:18:28.341019577Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 21:18:28.341182 containerd[1678]: time="2026-01-16T21:18:28.341091098Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 21:18:28.341249 kubelet[2882]: E0116 21:18:28.341218 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:18:28.341955 kubelet[2882]: E0116 21:18:28.341259 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:18:28.341955 kubelet[2882]: E0116 21:18:28.341374 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 21:18:28.343275 kubelet[2882]: E0116 21:18:28.343237 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:18:28.672140 containerd[1678]: time="2026-01-16T21:18:28.671761629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:18:29.005569 containerd[1678]: time="2026-01-16T21:18:29.005452014Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:18:29.012422 containerd[1678]: time="2026-01-16T21:18:29.012338491Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:18:29.012625 containerd[1678]: time="2026-01-16T21:18:29.012548600Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:18:29.012811 kubelet[2882]: E0116 21:18:29.012771 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:18:29.013053 kubelet[2882]: E0116 21:18:29.012816 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:18:29.013845 kubelet[2882]: E0116 21:18:29.013235 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72sgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-mcbl2_calico-apiserver(eb6b7565-0bac-4e11-9918-28c46c9c8c58): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:18:29.014426 kubelet[2882]: E0116 21:18:29.014400 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:18:30.671825 kubelet[2882]: E0116 21:18:30.671784 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:18:31.671148 kubelet[2882]: E0116 21:18:31.671080 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:18:32.670174 kubelet[2882]: E0116 21:18:32.670138 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:18:35.672281 kubelet[2882]: E0116 21:18:35.671958 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:18:41.672242 kubelet[2882]: E0116 21:18:41.671964 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:18:41.673985 kubelet[2882]: E0116 21:18:41.673952 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:18:42.672009 kubelet[2882]: E0116 
21:18:42.671960 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:18:45.672859 kubelet[2882]: E0116 21:18:45.672359 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:18:46.669758 kubelet[2882]: E0116 21:18:46.669717 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" 
podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:18:48.670074 kubelet[2882]: E0116 21:18:48.670030 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:18:53.671502 kubelet[2882]: E0116 21:18:53.671419 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:18:54.672198 kubelet[2882]: E0116 21:18:54.672138 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:18:56.669699 kubelet[2882]: E0116 21:18:56.669654 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:18:57.670675 kubelet[2882]: E0116 21:18:57.670310 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:18:59.671458 kubelet[2882]: E0116 21:18:59.671127 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: 
rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:19:02.671037 kubelet[2882]: E0116 21:19:02.670976 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:19:06.670354 kubelet[2882]: E0116 21:19:06.670211 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:19:08.670400 containerd[1678]: time="2026-01-16T21:19:08.670179153Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 21:19:09.020040 containerd[1678]: time="2026-01-16T21:19:09.019810352Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:19:09.021667 containerd[1678]: time="2026-01-16T21:19:09.021628505Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 21:19:09.021667 containerd[1678]: time="2026-01-16T21:19:09.021691465Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 21:19:09.022002 kubelet[2882]: E0116 21:19:09.021935 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:19:09.022472 kubelet[2882]: E0116 21:19:09.021975 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:19:09.022472 kubelet[2882]: E0116 21:19:09.022421 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aba7cb893c2466794c6629c238ba8cf,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 21:19:09.024902 containerd[1678]: time="2026-01-16T21:19:09.024813064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 21:19:09.342383 containerd[1678]: 
time="2026-01-16T21:19:09.341932161Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:19:09.343772 containerd[1678]: time="2026-01-16T21:19:09.343643691Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 21:19:09.343772 containerd[1678]: time="2026-01-16T21:19:09.343715721Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 21:19:09.344192 kubelet[2882]: E0116 21:19:09.344005 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:19:09.344192 kubelet[2882]: E0116 21:19:09.344046 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:19:09.344192 kubelet[2882]: E0116 21:19:09.344150 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 21:19:09.345545 kubelet[2882]: E0116 21:19:09.345511 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:19:09.436000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.3.156:22-4.153.228.146:59548 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:09.436656 systemd[1]: Started sshd@9-10.0.3.156:22-4.153.228.146:59548.service - OpenSSH per-connection server daemon (4.153.228.146:59548). Jan 16 21:19:09.437977 kernel: kauditd_printk_skb: 158 callbacks suppressed Jan 16 21:19:09.438015 kernel: audit: type=1130 audit(1768598349.436:746): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.3.156:22-4.153.228.146:59548 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:19:10.012619 sshd[5042]: Accepted publickey for core from 4.153.228.146 port 59548 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:10.018249 kernel: audit: type=1101 audit(1768598350.011:747): pid=5042 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.011000 audit[5042]: USER_ACCT pid=5042 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.017539 sshd-session[5042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:10.015000 audit[5042]: CRED_ACQ pid=5042 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.023616 kernel: audit: type=1103 audit(1768598350.015:748): pid=5042 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.026616 kernel: audit: type=1006 audit(1768598350.015:749): pid=5042 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 16 21:19:10.015000 audit[5042]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcd5862aa0 a2=3 a3=0 items=0 ppid=1 pid=5042 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:10.015000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:10.031797 systemd-logind[1647]: New session 11 of user core. Jan 16 21:19:10.033447 kernel: audit: type=1300 audit(1768598350.015:749): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcd5862aa0 a2=3 a3=0 items=0 ppid=1 pid=5042 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:10.033634 kernel: audit: type=1327 audit(1768598350.015:749): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:10.036422 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 16 21:19:10.040000 audit[5042]: USER_START pid=5042 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.047616 kernel: audit: type=1105 audit(1768598350.040:750): pid=5042 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.046000 audit[5046]: CRED_ACQ pid=5046 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.052619 kernel: audit: type=1103 audit(1768598350.046:751): 
pid=5046 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.435442 sshd[5046]: Connection closed by 4.153.228.146 port 59548 Jan 16 21:19:10.436377 sshd-session[5042]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:10.436000 audit[5042]: USER_END pid=5042 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.441797 systemd[1]: sshd@9-10.0.3.156:22-4.153.228.146:59548.service: Deactivated successfully. Jan 16 21:19:10.443716 kernel: audit: type=1106 audit(1768598350.436:752): pid=5042 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.444340 systemd[1]: session-11.scope: Deactivated successfully. Jan 16 21:19:10.437000 audit[5042]: CRED_DISP pid=5042 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.446531 systemd-logind[1647]: Session 11 logged out. Waiting for processes to exit. Jan 16 21:19:10.448011 systemd-logind[1647]: Removed session 11. 
Jan 16 21:19:10.449631 kernel: audit: type=1104 audit(1768598350.437:753): pid=5042 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:10.440000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.3.156:22-4.153.228.146:59548 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:10.670090 containerd[1678]: time="2026-01-16T21:19:10.670039389Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:19:11.015059 containerd[1678]: time="2026-01-16T21:19:11.015023836Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:19:11.016699 containerd[1678]: time="2026-01-16T21:19:11.016664964Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:19:11.016758 containerd[1678]: time="2026-01-16T21:19:11.016728168Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:19:11.016857 kubelet[2882]: E0116 21:19:11.016821 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:19:11.017102 kubelet[2882]: E0116 21:19:11.016861 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:19:11.017102 kubelet[2882]: E0116 21:19:11.016963 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72sgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-mcbl2_calico-apiserver(eb6b7565-0bac-4e11-9918-28c46c9c8c58): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:19:11.018465 kubelet[2882]: E0116 21:19:11.018410 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:19:11.670899 containerd[1678]: time="2026-01-16T21:19:11.670581744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 21:19:12.010059 containerd[1678]: time="2026-01-16T21:19:12.009933321Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
21:19:12.011831 containerd[1678]: time="2026-01-16T21:19:12.011583145Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 21:19:12.011831 containerd[1678]: time="2026-01-16T21:19:12.011653336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 21:19:12.011931 kubelet[2882]: E0116 21:19:12.011775 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:19:12.011931 kubelet[2882]: E0116 21:19:12.011857 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:19:12.012314 kubelet[2882]: E0116 21:19:12.012244 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twtjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-ns7rw_calico-system(3048fb4b-f634-4014-ae59-9ad3946acc61): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 21:19:12.013436 kubelet[2882]: E0116 21:19:12.013399 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:19:13.671622 containerd[1678]: time="2026-01-16T21:19:13.670682414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:19:14.008368 containerd[1678]: time="2026-01-16T21:19:14.008259806Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:19:14.009916 containerd[1678]: 
time="2026-01-16T21:19:14.009881020Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:19:14.009973 containerd[1678]: time="2026-01-16T21:19:14.009958473Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:19:14.010156 kubelet[2882]: E0116 21:19:14.010113 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:19:14.010655 kubelet[2882]: E0116 21:19:14.010168 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:19:14.010655 kubelet[2882]: E0116 21:19:14.010278 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdsvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-9lshv_calico-apiserver(ce4d12ad-802f-4a94-a69f-0be1648ae09b): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:19:14.011555 kubelet[2882]: E0116 21:19:14.011519 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:19:15.553761 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:19:15.553849 kernel: audit: type=1130 audit(1768598355.546:755): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.156:22-4.153.228.146:34456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:15.546000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.156:22-4.153.228.146:34456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:15.547680 systemd[1]: Started sshd@10-10.0.3.156:22-4.153.228.146:34456.service - OpenSSH per-connection server daemon (4.153.228.146:34456). 
Jan 16 21:19:16.073000 audit[5060]: USER_ACCT pid=5060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.075374 sshd[5060]: Accepted publickey for core from 4.153.228.146 port 34456 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:16.079622 kernel: audit: type=1101 audit(1768598356.073:756): pid=5060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.078000 audit[5060]: CRED_ACQ pid=5060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.080952 sshd-session[5060]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:16.083630 kernel: audit: type=1103 audit(1768598356.078:757): pid=5060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.086622 kernel: audit: type=1006 audit(1768598356.078:758): pid=5060 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 16 21:19:16.078000 audit[5060]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc77fbcf50 a2=3 a3=0 items=0 ppid=1 pid=5060 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:16.090510 systemd-logind[1647]: New session 12 of user core. Jan 16 21:19:16.092906 kernel: audit: type=1300 audit(1768598356.078:758): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc77fbcf50 a2=3 a3=0 items=0 ppid=1 pid=5060 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:16.092966 kernel: audit: type=1327 audit(1768598356.078:758): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:16.078000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:16.097950 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 16 21:19:16.099000 audit[5060]: USER_START pid=5060 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.102000 audit[5064]: CRED_ACQ pid=5064 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.107463 kernel: audit: type=1105 audit(1768598356.099:759): pid=5060 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.107514 kernel: audit: type=1103 audit(1768598356.102:760): pid=5064 uid=0 auid=500 
ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.429338 sshd[5064]: Connection closed by 4.153.228.146 port 34456 Jan 16 21:19:16.431397 sshd-session[5060]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:16.431000 audit[5060]: USER_END pid=5060 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.437076 systemd[1]: sshd@10-10.0.3.156:22-4.153.228.146:34456.service: Deactivated successfully. Jan 16 21:19:16.431000 audit[5060]: CRED_DISP pid=5060 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.440881 kernel: audit: type=1106 audit(1768598356.431:761): pid=5060 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.440928 kernel: audit: type=1104 audit(1768598356.431:762): pid=5060 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:16.440682 systemd[1]: session-12.scope: Deactivated successfully. 
Jan 16 21:19:16.443953 systemd-logind[1647]: Session 12 logged out. Waiting for processes to exit. Jan 16 21:19:16.435000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.156:22-4.153.228.146:34456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:16.446493 systemd-logind[1647]: Removed session 12. Jan 16 21:19:17.672500 containerd[1678]: time="2026-01-16T21:19:17.671578903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 21:19:18.000997 containerd[1678]: time="2026-01-16T21:19:18.000876205Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:19:18.002992 containerd[1678]: time="2026-01-16T21:19:18.002950196Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 21:19:18.003065 containerd[1678]: time="2026-01-16T21:19:18.003050543Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 21:19:18.003620 kubelet[2882]: E0116 21:19:18.003222 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:19:18.003620 kubelet[2882]: E0116 21:19:18.003269 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: 
not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:19:18.003620 kubelet[2882]: E0116 21:19:18.003387 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btq9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56f896845d-kkjxr_calico-system(91564960-b99d-4a80-9373-1c0eb8e68a7f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 21:19:18.005073 kubelet[2882]: E0116 21:19:18.005020 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:19:20.670876 containerd[1678]: time="2026-01-16T21:19:20.670719582Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 21:19:20.995863 containerd[1678]: time="2026-01-16T21:19:20.995587707Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
21:19:20.997462 containerd[1678]: time="2026-01-16T21:19:20.997355165Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 21:19:20.997462 containerd[1678]: time="2026-01-16T21:19:20.997431498Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 21:19:20.998087 kubelet[2882]: E0116 21:19:20.997687 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:19:20.998087 kubelet[2882]: E0116 21:19:20.997734 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:19:20.998087 kubelet[2882]: E0116 21:19:20.997838 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 16 21:19:21.001207 containerd[1678]: time="2026-01-16T21:19:21.001038216Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 21:19:21.344165 containerd[1678]: time="2026-01-16T21:19:21.344040300Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:19:21.346196 containerd[1678]: time="2026-01-16T21:19:21.346150702Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 21:19:21.346267 containerd[1678]: time="2026-01-16T21:19:21.346229993Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 21:19:21.346404 kubelet[2882]: E0116 21:19:21.346369 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:19:21.346442 kubelet[2882]: E0116 21:19:21.346416 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:19:21.346553 kubelet[2882]: E0116 21:19:21.346521 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 21:19:21.347751 kubelet[2882]: E0116 21:19:21.347721 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:19:21.535824 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:19:21.535893 kernel: audit: type=1130 audit(1768598361.533:764): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.156:22-4.153.228.146:34472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:21.533000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.156:22-4.153.228.146:34472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:21.534866 systemd[1]: Started sshd@11-10.0.3.156:22-4.153.228.146:34472.service - OpenSSH per-connection server daemon (4.153.228.146:34472). 
Jan 16 21:19:22.063000 audit[5078]: USER_ACCT pid=5078 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.069492 sshd[5078]: Accepted publickey for core from 4.153.228.146 port 34472 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:22.069769 kernel: audit: type=1101 audit(1768598362.063:765): pid=5078 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.068000 audit[5078]: CRED_ACQ pid=5078 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.070720 sshd-session[5078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:22.074921 kernel: audit: type=1103 audit(1768598362.068:766): pid=5078 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.074977 kernel: audit: type=1006 audit(1768598362.068:767): pid=5078 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 16 21:19:22.068000 audit[5078]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffb8e43280 a2=3 a3=0 items=0 ppid=1 pid=5078 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:22.079320 kernel: audit: type=1300 audit(1768598362.068:767): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffb8e43280 a2=3 a3=0 items=0 ppid=1 pid=5078 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:22.068000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:22.083268 kernel: audit: type=1327 audit(1768598362.068:767): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:22.083660 systemd-logind[1647]: New session 13 of user core. Jan 16 21:19:22.091782 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 16 21:19:22.092000 audit[5078]: USER_START pid=5078 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.094000 audit[5082]: CRED_ACQ pid=5082 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.100405 kernel: audit: type=1105 audit(1768598362.092:768): pid=5078 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.100461 kernel: audit: type=1103 audit(1768598362.094:769): pid=5082 uid=0 auid=500 
ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.416647 sshd[5082]: Connection closed by 4.153.228.146 port 34472 Jan 16 21:19:22.417517 sshd-session[5078]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:22.416000 audit[5078]: USER_END pid=5078 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.421067 systemd[1]: sshd@11-10.0.3.156:22-4.153.228.146:34472.service: Deactivated successfully. Jan 16 21:19:22.422881 systemd[1]: session-13.scope: Deactivated successfully. Jan 16 21:19:22.423646 kernel: audit: type=1106 audit(1768598362.416:770): pid=5078 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.416000 audit[5078]: CRED_DISP pid=5078 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.424620 systemd-logind[1647]: Session 13 logged out. Waiting for processes to exit. Jan 16 21:19:22.419000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.156:22-4.153.228.146:34472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:19:22.428286 systemd-logind[1647]: Removed session 13. Jan 16 21:19:22.428656 kernel: audit: type=1104 audit(1768598362.416:771): pid=5078 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:22.526000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.3.156:22-4.153.228.146:34480 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:22.527853 systemd[1]: Started sshd@12-10.0.3.156:22-4.153.228.146:34480.service - OpenSSH per-connection server daemon (4.153.228.146:34480). Jan 16 21:19:22.671039 kubelet[2882]: E0116 21:19:22.670753 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:19:23.056000 audit[5095]: USER_ACCT pid=5095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:23.058097 sshd[5095]: Accepted publickey for core from 4.153.228.146 port 34480 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:23.057000 audit[5095]: CRED_ACQ pid=5095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:23.058000 audit[5095]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffffcf71f20 a2=3 a3=0 items=0 ppid=1 pid=5095 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:23.058000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:23.060364 sshd-session[5095]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:23.068355 systemd-logind[1647]: New session 14 of user core. Jan 16 21:19:23.071957 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 16 21:19:23.073000 audit[5095]: USER_START pid=5095 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:23.075000 audit[5099]: CRED_ACQ pid=5099 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:23.445534 sshd[5099]: Connection closed by 4.153.228.146 port 34480 Jan 16 21:19:23.446210 sshd-session[5095]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:23.445000 audit[5095]: USER_END pid=5095 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:23.445000 audit[5095]: CRED_DISP pid=5095 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:23.450274 systemd[1]: sshd@12-10.0.3.156:22-4.153.228.146:34480.service: Deactivated successfully. Jan 16 21:19:23.449000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.3.156:22-4.153.228.146:34480 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:23.453670 systemd[1]: session-14.scope: Deactivated successfully. Jan 16 21:19:23.455029 systemd-logind[1647]: Session 14 logged out. Waiting for processes to exit. Jan 16 21:19:23.457002 systemd-logind[1647]: Removed session 14. Jan 16 21:19:23.552000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.3.156:22-4.153.228.146:34492 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:23.553698 systemd[1]: Started sshd@13-10.0.3.156:22-4.153.228.146:34492.service - OpenSSH per-connection server daemon (4.153.228.146:34492). 
Jan 16 21:19:23.671327 kubelet[2882]: E0116 21:19:23.671289 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:19:23.672139 kubelet[2882]: E0116 21:19:23.672113 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:19:24.084000 audit[5109]: USER_ACCT pid=5109 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:24.086160 sshd[5109]: Accepted publickey for core from 4.153.228.146 port 34492 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 
16 21:19:24.085000 audit[5109]: CRED_ACQ pid=5109 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:24.085000 audit[5109]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe85f88cc0 a2=3 a3=0 items=0 ppid=1 pid=5109 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:24.085000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:24.087698 sshd-session[5109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:24.092648 systemd-logind[1647]: New session 15 of user core. Jan 16 21:19:24.099807 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 16 21:19:24.101000 audit[5109]: USER_START pid=5109 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:24.103000 audit[5113]: CRED_ACQ pid=5113 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:24.436471 sshd[5113]: Connection closed by 4.153.228.146 port 34492 Jan 16 21:19:24.437333 sshd-session[5109]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:24.441000 audit[5109]: USER_END pid=5109 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:24.441000 audit[5109]: CRED_DISP pid=5109 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:24.446452 systemd[1]: sshd@13-10.0.3.156:22-4.153.228.146:34492.service: Deactivated successfully. Jan 16 21:19:24.445000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.3.156:22-4.153.228.146:34492 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:24.450433 systemd[1]: session-15.scope: Deactivated successfully. Jan 16 21:19:24.453113 systemd-logind[1647]: Session 15 logged out. Waiting for processes to exit. Jan 16 21:19:24.455053 systemd-logind[1647]: Removed session 15. 
Jan 16 21:19:26.670636 kubelet[2882]: E0116 21:19:26.670335 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:19:29.554200 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 16 21:19:29.554323 kernel: audit: type=1130 audit(1768598369.545:791): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.156:22-4.153.228.146:33484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:29.545000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.156:22-4.153.228.146:33484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:29.546850 systemd[1]: Started sshd@14-10.0.3.156:22-4.153.228.146:33484.service - OpenSSH per-connection server daemon (4.153.228.146:33484). 
Jan 16 21:19:30.077000 audit[5129]: USER_ACCT pid=5129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.080492 sshd-session[5129]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:30.081079 sshd[5129]: Accepted publickey for core from 4.153.228.146 port 33484 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:30.083745 kernel: audit: type=1101 audit(1768598370.077:792): pid=5129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.078000 audit[5129]: CRED_ACQ pid=5129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.092814 kernel: audit: type=1103 audit(1768598370.078:793): pid=5129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.093242 systemd-logind[1647]: New session 16 of user core. 
Jan 16 21:19:30.100639 kernel: audit: type=1006 audit(1768598370.078:794): pid=5129 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 16 21:19:30.100728 kernel: audit: type=1300 audit(1768598370.078:794): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc41829260 a2=3 a3=0 items=0 ppid=1 pid=5129 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:30.078000 audit[5129]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc41829260 a2=3 a3=0 items=0 ppid=1 pid=5129 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:30.078000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:30.105608 kernel: audit: type=1327 audit(1768598370.078:794): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:30.105915 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 16 21:19:30.106000 audit[5129]: USER_START pid=5129 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.114706 kernel: audit: type=1105 audit(1768598370.106:795): pid=5129 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.112000 audit[5133]: CRED_ACQ pid=5133 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.122564 kernel: audit: type=1103 audit(1768598370.112:796): pid=5133 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.428001 sshd[5133]: Connection closed by 4.153.228.146 port 33484 Jan 16 21:19:30.428998 sshd-session[5129]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:30.428000 audit[5129]: USER_END pid=5129 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.435681 kernel: audit: type=1106 
audit(1768598370.428:797): pid=5129 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.435764 kernel: audit: type=1104 audit(1768598370.428:798): pid=5129 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.428000 audit[5129]: CRED_DISP pid=5129 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:30.437386 systemd[1]: sshd@14-10.0.3.156:22-4.153.228.146:33484.service: Deactivated successfully. Jan 16 21:19:30.439504 systemd[1]: session-16.scope: Deactivated successfully. Jan 16 21:19:30.436000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.156:22-4.153.228.146:33484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:30.441060 systemd-logind[1647]: Session 16 logged out. Waiting for processes to exit. Jan 16 21:19:30.441846 systemd-logind[1647]: Removed session 16. 
Jan 16 21:19:32.670612 kubelet[2882]: E0116 21:19:32.670357 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:19:34.673893 kubelet[2882]: E0116 21:19:34.673848 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:19:35.538801 systemd[1]: Started sshd@15-10.0.3.156:22-4.153.228.146:58390.service - OpenSSH per-connection server daemon (4.153.228.146:58390). 
Jan 16 21:19:35.544887 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:19:35.544979 kernel: audit: type=1130 audit(1768598375.537:800): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.156:22-4.153.228.146:58390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:35.537000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.156:22-4.153.228.146:58390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:35.675640 kubelet[2882]: E0116 21:19:35.674722 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:19:35.675640 kubelet[2882]: E0116 21:19:35.675041 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:19:36.070000 audit[5147]: USER_ACCT pid=5147 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.077090 sshd[5147]: Accepted publickey for core from 4.153.228.146 port 58390 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:36.077629 kernel: audit: type=1101 audit(1768598376.070:801): pid=5147 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.076000 audit[5147]: CRED_ACQ pid=5147 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.081616 kernel: audit: type=1103 audit(1768598376.076:802): pid=5147 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.082418 sshd-session[5147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:36.085620 kernel: audit: type=1006 audit(1768598376.080:803): pid=5147 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 16 21:19:36.085667 kernel: audit: type=1300 audit(1768598376.080:803): arch=c000003e syscall=1 
success=yes exit=3 a0=8 a1=7ffe8df60d30 a2=3 a3=0 items=0 ppid=1 pid=5147 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:36.080000 audit[5147]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe8df60d30 a2=3 a3=0 items=0 ppid=1 pid=5147 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:36.080000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:36.090999 kernel: audit: type=1327 audit(1768598376.080:803): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:36.098785 systemd-logind[1647]: New session 17 of user core. Jan 16 21:19:36.104262 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 16 21:19:36.107000 audit[5147]: USER_START pid=5147 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.114629 kernel: audit: type=1105 audit(1768598376.107:804): pid=5147 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.116000 audit[5174]: CRED_ACQ pid=5174 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.122617 kernel: audit: type=1103 audit(1768598376.116:805): pid=5174 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.446157 sshd[5174]: Connection closed by 4.153.228.146 port 58390 Jan 16 21:19:36.446971 sshd-session[5147]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:36.448000 audit[5147]: USER_END pid=5147 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.455621 kernel: audit: type=1106 audit(1768598376.448:806): pid=5147 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.457340 systemd-logind[1647]: Session 17 logged out. Waiting for processes to exit. Jan 16 21:19:36.458121 systemd[1]: sshd@15-10.0.3.156:22-4.153.228.146:58390.service: Deactivated successfully. Jan 16 21:19:36.453000 audit[5147]: CRED_DISP pid=5147 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.462140 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 16 21:19:36.463890 kernel: audit: type=1104 audit(1768598376.453:807): pid=5147 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:36.456000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.156:22-4.153.228.146:58390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:36.464825 systemd-logind[1647]: Removed session 17. Jan 16 21:19:37.671296 kubelet[2882]: E0116 21:19:37.670942 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:19:37.671296 kubelet[2882]: E0116 21:19:37.671249 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:19:41.557156 systemd[1]: Started sshd@16-10.0.3.156:22-4.153.228.146:58402.service - OpenSSH per-connection server daemon (4.153.228.146:58402). 
Jan 16 21:19:41.556000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.156:22-4.153.228.146:58402 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:41.559733 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:19:41.559785 kernel: audit: type=1130 audit(1768598381.556:809): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.156:22-4.153.228.146:58402 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:42.103914 sshd[5186]: Accepted publickey for core from 4.153.228.146 port 58402 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:42.103000 audit[5186]: USER_ACCT pid=5186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.112613 kernel: audit: type=1101 audit(1768598382.103:810): pid=5186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.112676 kernel: audit: type=1103 audit(1768598382.109:811): pid=5186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.109000 audit[5186]: CRED_ACQ pid=5186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.111406 sshd-session[5186]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:42.119638 kernel: audit: type=1006 audit(1768598382.110:812): pid=5186 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 16 21:19:42.110000 audit[5186]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcc3faa810 a2=3 a3=0 items=0 ppid=1 pid=5186 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:42.125613 kernel: audit: type=1300 audit(1768598382.110:812): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcc3faa810 a2=3 a3=0 items=0 ppid=1 pid=5186 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:42.110000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:42.128655 systemd-logind[1647]: New session 18 of user core. Jan 16 21:19:42.131612 kernel: audit: type=1327 audit(1768598382.110:812): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:42.134770 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 16 21:19:42.137000 audit[5186]: USER_START pid=5186 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.144003 kernel: audit: type=1105 audit(1768598382.137:813): pid=5186 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.143000 audit[5190]: CRED_ACQ pid=5190 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.149613 kernel: audit: type=1103 audit(1768598382.143:814): pid=5190 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.476510 sshd[5190]: Connection closed by 4.153.228.146 port 58402 Jan 16 21:19:42.477126 sshd-session[5186]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:42.477000 audit[5186]: USER_END pid=5186 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.483641 kernel: audit: type=1106 
audit(1768598382.477:815): pid=5186 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.483617 systemd[1]: sshd@16-10.0.3.156:22-4.153.228.146:58402.service: Deactivated successfully. Jan 16 21:19:42.477000 audit[5186]: CRED_DISP pid=5186 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.487652 kernel: audit: type=1104 audit(1768598382.477:816): pid=5186 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:42.485190 systemd[1]: session-18.scope: Deactivated successfully. Jan 16 21:19:42.488442 systemd-logind[1647]: Session 18 logged out. Waiting for processes to exit. Jan 16 21:19:42.483000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.156:22-4.153.228.146:58402 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:42.490922 systemd-logind[1647]: Removed session 18. 
Jan 16 21:19:46.670056 kubelet[2882]: E0116 21:19:46.670008 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:19:47.586000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.156:22-4.153.228.146:52288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:47.587097 systemd[1]: Started sshd@17-10.0.3.156:22-4.153.228.146:52288.service - OpenSSH per-connection server daemon (4.153.228.146:52288). Jan 16 21:19:47.588909 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:19:47.588947 kernel: audit: type=1130 audit(1768598387.586:818): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.156:22-4.153.228.146:52288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:19:47.674015 kubelet[2882]: E0116 21:19:47.673980 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:19:48.127000 audit[5202]: USER_ACCT pid=5202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.128818 sshd[5202]: Accepted publickey for core from 4.153.228.146 port 52288 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:48.132636 kernel: audit: type=1101 audit(1768598388.127:819): pid=5202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.133741 sshd-session[5202]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:48.132000 audit[5202]: CRED_ACQ pid=5202 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.138626 kernel: audit: type=1103 audit(1768598388.132:820): pid=5202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.132000 audit[5202]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffebd133300 a2=3 a3=0 items=0 ppid=1 pid=5202 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:48.144862 kernel: audit: type=1006 audit(1768598388.132:821): pid=5202 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 16 21:19:48.145031 kernel: audit: type=1300 audit(1768598388.132:821): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffebd133300 a2=3 a3=0 items=0 ppid=1 pid=5202 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:48.132000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:48.148190 kernel: audit: type=1327 audit(1768598388.132:821): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:48.147936 systemd-logind[1647]: New session 19 of user core. Jan 16 21:19:48.158899 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 16 21:19:48.162000 audit[5202]: USER_START pid=5202 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.169766 kernel: audit: type=1105 audit(1768598388.162:822): pid=5202 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.169000 audit[5206]: CRED_ACQ pid=5206 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.174632 kernel: audit: type=1103 audit(1768598388.169:823): pid=5206 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.498187 sshd[5206]: Connection closed by 4.153.228.146 port 52288 Jan 16 21:19:48.499433 sshd-session[5202]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:48.500000 audit[5202]: USER_END pid=5202 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.504350 systemd[1]: 
sshd@17-10.0.3.156:22-4.153.228.146:52288.service: Deactivated successfully. Jan 16 21:19:48.507428 systemd[1]: session-19.scope: Deactivated successfully. Jan 16 21:19:48.507617 kernel: audit: type=1106 audit(1768598388.500:824): pid=5202 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.500000 audit[5202]: CRED_DISP pid=5202 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.513277 systemd-logind[1647]: Session 19 logged out. Waiting for processes to exit. Jan 16 21:19:48.513623 kernel: audit: type=1104 audit(1768598388.500:825): pid=5202 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:48.515328 systemd-logind[1647]: Removed session 19. Jan 16 21:19:48.503000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.156:22-4.153.228.146:52288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:48.606981 systemd[1]: Started sshd@18-10.0.3.156:22-4.153.228.146:52300.service - OpenSSH per-connection server daemon (4.153.228.146:52300). Jan 16 21:19:48.606000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.3.156:22-4.153.228.146:52300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:19:48.671739 kubelet[2882]: E0116 21:19:48.671699 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:19:49.150000 audit[5217]: USER_ACCT pid=5217 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:49.151072 sshd[5217]: Accepted publickey for core from 4.153.228.146 port 52300 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:49.152000 audit[5217]: CRED_ACQ pid=5217 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:49.152000 audit[5217]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffedf671b90 a2=3 a3=0 items=0 ppid=1 pid=5217 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:49.152000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:49.153967 sshd-session[5217]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:49.162951 systemd-logind[1647]: New session 20 of user core. 
Jan 16 21:19:49.166825 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 16 21:19:49.176000 audit[5217]: USER_START pid=5217 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:49.184000 audit[5221]: CRED_ACQ pid=5221 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:49.865247 sshd[5221]: Connection closed by 4.153.228.146 port 52300 Jan 16 21:19:49.865066 sshd-session[5217]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:49.866000 audit[5217]: USER_END pid=5217 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:49.867000 audit[5217]: CRED_DISP pid=5217 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:49.871240 systemd[1]: sshd@18-10.0.3.156:22-4.153.228.146:52300.service: Deactivated successfully. Jan 16 21:19:49.871000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.3.156:22-4.153.228.146:52300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:19:49.873992 systemd[1]: session-20.scope: Deactivated successfully. Jan 16 21:19:49.875740 systemd-logind[1647]: Session 20 logged out. Waiting for processes to exit. Jan 16 21:19:49.877376 systemd-logind[1647]: Removed session 20. Jan 16 21:19:49.977922 systemd[1]: Started sshd@19-10.0.3.156:22-4.153.228.146:52314.service - OpenSSH per-connection server daemon (4.153.228.146:52314). Jan 16 21:19:49.977000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.3.156:22-4.153.228.146:52314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:50.530000 audit[5230]: USER_ACCT pid=5230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:50.531341 sshd[5230]: Accepted publickey for core from 4.153.228.146 port 52314 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:50.531000 audit[5230]: CRED_ACQ pid=5230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:50.531000 audit[5230]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffddd31be10 a2=3 a3=0 items=0 ppid=1 pid=5230 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:50.531000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:50.533023 sshd-session[5230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 
21:19:50.538167 systemd-logind[1647]: New session 21 of user core. Jan 16 21:19:50.544978 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 16 21:19:50.547000 audit[5230]: USER_START pid=5230 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:50.550000 audit[5234]: CRED_ACQ pid=5234 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:50.670643 kubelet[2882]: E0116 21:19:50.670574 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:19:51.277000 audit[5244]: NETFILTER_CFG table=filter:142 family=2 entries=14 op=nft_register_rule pid=5244 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:19:51.277000 audit[5244]: SYSCALL arch=c000003e syscall=46 
success=yes exit=5248 a0=3 a1=7fff9a277930 a2=0 a3=7fff9a27791c items=0 ppid=3028 pid=5244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:51.277000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:19:51.281000 audit[5244]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5244 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:19:51.281000 audit[5244]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff9a277930 a2=0 a3=7fff9a27791c items=0 ppid=3028 pid=5244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:51.281000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:19:51.303000 audit[5246]: NETFILTER_CFG table=filter:144 family=2 entries=26 op=nft_register_rule pid=5246 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 21:19:51.303000 audit[5246]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff84f5eda0 a2=0 a3=7fff84f5ed8c items=0 ppid=3028 pid=5246 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:51.303000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:19:51.306000 audit[5246]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5246 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" 
Jan 16 21:19:51.306000 audit[5246]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff84f5eda0 a2=0 a3=0 items=0 ppid=3028 pid=5246 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:51.306000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 21:19:51.341617 sshd[5234]: Connection closed by 4.153.228.146 port 52314 Jan 16 21:19:51.341584 sshd-session[5230]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:51.346000 audit[5230]: USER_END pid=5230 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:51.346000 audit[5230]: CRED_DISP pid=5230 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:51.348956 systemd[1]: sshd@19-10.0.3.156:22-4.153.228.146:52314.service: Deactivated successfully. Jan 16 21:19:51.348000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.3.156:22-4.153.228.146:52314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:51.350960 systemd[1]: session-21.scope: Deactivated successfully. Jan 16 21:19:51.352276 systemd-logind[1647]: Session 21 logged out. Waiting for processes to exit. Jan 16 21:19:51.352933 systemd-logind[1647]: Removed session 21. 
Jan 16 21:19:51.452000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.156:22-4.153.228.146:52324 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:51.452665 systemd[1]: Started sshd@20-10.0.3.156:22-4.153.228.146:52324.service - OpenSSH per-connection server daemon (4.153.228.146:52324). Jan 16 21:19:51.672117 kubelet[2882]: E0116 21:19:51.672085 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:19:51.672516 kubelet[2882]: E0116 21:19:51.672445 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:19:52.015000 audit[5251]: USER_ACCT pid=5251 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:52.016842 sshd[5251]: Accepted publickey for core from 4.153.228.146 port 
52324 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:52.017000 audit[5251]: CRED_ACQ pid=5251 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:52.017000 audit[5251]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff7c035260 a2=3 a3=0 items=0 ppid=1 pid=5251 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:52.017000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:52.018932 sshd-session[5251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:52.024820 systemd-logind[1647]: New session 22 of user core. Jan 16 21:19:52.029766 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 16 21:19:52.032000 audit[5251]: USER_START pid=5251 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:52.034000 audit[5255]: CRED_ACQ pid=5255 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:52.474029 sshd[5255]: Connection closed by 4.153.228.146 port 52324 Jan 16 21:19:52.474339 sshd-session[5251]: pam_unix(sshd:session): session closed for user core Jan 16 21:19:52.475000 audit[5251]: USER_END pid=5251 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:52.475000 audit[5251]: CRED_DISP pid=5251 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:52.478386 systemd-logind[1647]: Session 22 logged out. Waiting for processes to exit. Jan 16 21:19:52.478676 systemd[1]: sshd@20-10.0.3.156:22-4.153.228.146:52324.service: Deactivated successfully. Jan 16 21:19:52.478000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.156:22-4.153.228.146:52324 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:19:52.480655 systemd[1]: session-22.scope: Deactivated successfully. Jan 16 21:19:52.482322 systemd-logind[1647]: Removed session 22. Jan 16 21:19:52.584054 systemd[1]: Started sshd@21-10.0.3.156:22-4.153.228.146:52330.service - OpenSSH per-connection server daemon (4.153.228.146:52330). Jan 16 21:19:52.583000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.156:22-4.153.228.146:52330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:19:53.128516 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 16 21:19:53.128660 kernel: audit: type=1101 audit(1768598393.125:859): pid=5265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:53.125000 audit[5265]: USER_ACCT pid=5265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:53.128087 sshd-session[5265]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:19:53.128984 sshd[5265]: Accepted publickey for core from 4.153.228.146 port 52330 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:19:53.126000 audit[5265]: CRED_ACQ pid=5265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:53.133532 kernel: audit: type=1103 audit(1768598393.126:860): pid=5265 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:19:53.126000 audit[5265]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe7e3f9950 a2=3 a3=0 items=0 ppid=1 pid=5265 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:53.140366 systemd-logind[1647]: New session 23 of user core. Jan 16 21:19:53.142080 kernel: audit: type=1006 audit(1768598393.126:861): pid=5265 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 16 21:19:53.142146 kernel: audit: type=1300 audit(1768598393.126:861): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe7e3f9950 a2=3 a3=0 items=0 ppid=1 pid=5265 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:19:53.145189 kernel: audit: type=1327 audit(1768598393.126:861): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:53.126000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:19:53.144785 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 16 21:19:53.148000 audit[5265]: USER_START pid=5265 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:53.150000 audit[5269]: CRED_ACQ pid=5269 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:53.154802 kernel: audit: type=1105 audit(1768598393.148:862): pid=5265 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:53.154839 kernel: audit: type=1103 audit(1768598393.150:863): pid=5269 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:53.493032 sshd[5269]: Connection closed by 4.153.228.146 port 52330
Jan 16 21:19:53.494758 sshd-session[5265]: pam_unix(sshd:session): session closed for user core
Jan 16 21:19:53.497000 audit[5265]: USER_END pid=5265 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:53.501019 systemd[1]: sshd@21-10.0.3.156:22-4.153.228.146:52330.service: Deactivated successfully.
Jan 16 21:19:53.504836 kernel: audit: type=1106 audit(1768598393.497:864): pid=5265 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:53.504396 systemd[1]: session-23.scope: Deactivated successfully.
Jan 16 21:19:53.497000 audit[5265]: CRED_DISP pid=5265 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:53.509074 systemd-logind[1647]: Session 23 logged out. Waiting for processes to exit.
Jan 16 21:19:53.510131 kernel: audit: type=1104 audit(1768598393.497:865): pid=5265 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:53.510519 systemd-logind[1647]: Removed session 23.
Jan 16 21:19:53.499000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.156:22-4.153.228.146:52330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:19:53.514637 kernel: audit: type=1131 audit(1768598393.499:866): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.156:22-4.153.228.146:52330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:19:56.217000 audit[5282]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5282 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 16 21:19:56.217000 audit[5282]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fffe6f75e70 a2=0 a3=7fffe6f75e5c items=0 ppid=3028 pid=5282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:19:56.217000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Jan 16 21:19:56.222000 audit[5282]: NETFILTER_CFG table=nat:147 family=2 entries=104 op=nft_register_chain pid=5282 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 16 21:19:56.222000 audit[5282]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7fffe6f75e70 a2=0 a3=7fffe6f75e5c items=0 ppid=3028 pid=5282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:19:56.222000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Jan 16 21:19:58.601351 kernel: kauditd_printk_skb: 6 callbacks suppressed
Jan 16 21:19:58.601444 kernel: audit: type=1130 audit(1768598398.599:869): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.156:22-4.153.228.146:44410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:19:58.599000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.156:22-4.153.228.146:44410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:19:58.600453 systemd[1]: Started sshd@22-10.0.3.156:22-4.153.228.146:44410.service - OpenSSH per-connection server daemon (4.153.228.146:44410).
Jan 16 21:19:59.133000 audit[5284]: USER_ACCT pid=5284 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.135239 sshd[5284]: Accepted publickey for core from 4.153.228.146 port 44410 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs
Jan 16 21:19:59.136959 sshd-session[5284]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 16 21:19:59.135000 audit[5284]: CRED_ACQ pid=5284 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.140658 kernel: audit: type=1101 audit(1768598399.133:870): pid=5284 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.140717 kernel: audit: type=1103 audit(1768598399.135:871): pid=5284 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.144271 kernel: audit: type=1006 audit(1768598399.135:872): pid=5284 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1
Jan 16 21:19:59.145357 systemd-logind[1647]: New session 24 of user core.
Jan 16 21:19:59.135000 audit[5284]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffec91eabb0 a2=3 a3=0 items=0 ppid=1 pid=5284 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:19:59.135000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 21:19:59.150894 kernel: audit: type=1300 audit(1768598399.135:872): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffec91eabb0 a2=3 a3=0 items=0 ppid=1 pid=5284 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:19:59.150943 kernel: audit: type=1327 audit(1768598399.135:872): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 21:19:59.153841 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 16 21:19:59.158000 audit[5284]: USER_START pid=5284 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.163000 audit[5288]: CRED_ACQ pid=5288 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.165819 kernel: audit: type=1105 audit(1768598399.158:873): pid=5284 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.165873 kernel: audit: type=1103 audit(1768598399.163:874): pid=5288 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.481073 sshd[5288]: Connection closed by 4.153.228.146 port 44410
Jan 16 21:19:59.481659 sshd-session[5284]: pam_unix(sshd:session): session closed for user core
Jan 16 21:19:59.483000 audit[5284]: USER_END pid=5284 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.487307 systemd[1]: sshd@22-10.0.3.156:22-4.153.228.146:44410.service: Deactivated successfully.
Jan 16 21:19:59.492730 kernel: audit: type=1106 audit(1768598399.483:875): pid=5284 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.492789 kernel: audit: type=1104 audit(1768598399.483:876): pid=5284 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.483000 audit[5284]: CRED_DISP pid=5284 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:19:59.489674 systemd[1]: session-24.scope: Deactivated successfully.
Jan 16 21:19:59.490540 systemd-logind[1647]: Session 24 logged out. Waiting for processes to exit.
Jan 16 21:19:59.485000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.156:22-4.153.228.146:44410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:19:59.493088 systemd-logind[1647]: Removed session 24.
Jan 16 21:19:59.672183 kubelet[2882]: E0116 21:19:59.672134 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f"
Jan 16 21:19:59.674800 kubelet[2882]: E0116 21:19:59.674748 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c"
Jan 16 21:20:03.671585 kubelet[2882]: E0116 21:20:03.671532 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61"
Jan 16 21:20:03.672951 kubelet[2882]: E0116 21:20:03.671858 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3"
Jan 16 21:20:04.595546 systemd[1]: Started sshd@23-10.0.3.156:22-4.153.228.146:47008.service - OpenSSH per-connection server daemon (4.153.228.146:47008).
Jan 16 21:20:04.595000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.156:22-4.153.228.146:47008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:20:04.597802 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 16 21:20:04.597851 kernel: audit: type=1130 audit(1768598404.595:878): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.156:22-4.153.228.146:47008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:20:04.670055 kubelet[2882]: E0116 21:20:04.670021 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b"
Jan 16 21:20:05.150000 audit[5302]: USER_ACCT pid=5302 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.152765 sshd[5302]: Accepted publickey for core from 4.153.228.146 port 47008 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs
Jan 16 21:20:05.155622 kernel: audit: type=1101 audit(1768598405.150:879): pid=5302 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.156337 sshd-session[5302]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 16 21:20:05.154000 audit[5302]: CRED_ACQ pid=5302 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.160699 kernel: audit: type=1103 audit(1768598405.154:880): pid=5302 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.160753 kernel: audit: type=1006 audit(1768598405.154:881): pid=5302 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1
Jan 16 21:20:05.154000 audit[5302]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc10dd2800 a2=3 a3=0 items=0 ppid=1 pid=5302 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:05.154000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 21:20:05.169384 kernel: audit: type=1300 audit(1768598405.154:881): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc10dd2800 a2=3 a3=0 items=0 ppid=1 pid=5302 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:05.169450 kernel: audit: type=1327 audit(1768598405.154:881): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 21:20:05.169686 systemd-logind[1647]: New session 25 of user core.
Jan 16 21:20:05.174664 systemd[1]: Started session-25.scope - Session 25 of User core.
Jan 16 21:20:05.177000 audit[5302]: USER_START pid=5302 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.183635 kernel: audit: type=1105 audit(1768598405.177:882): pid=5302 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.177000 audit[5306]: CRED_ACQ pid=5306 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.189615 kernel: audit: type=1103 audit(1768598405.177:883): pid=5306 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.509188 sshd[5306]: Connection closed by 4.153.228.146 port 47008
Jan 16 21:20:05.509050 sshd-session[5302]: pam_unix(sshd:session): session closed for user core
Jan 16 21:20:05.511000 audit[5302]: USER_END pid=5302 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.511000 audit[5302]: CRED_DISP pid=5302 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.518848 kernel: audit: type=1106 audit(1768598405.511:884): pid=5302 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.518902 kernel: audit: type=1104 audit(1768598405.511:885): pid=5302 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:05.518165 systemd[1]: sshd@23-10.0.3.156:22-4.153.228.146:47008.service: Deactivated successfully.
Jan 16 21:20:05.518000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.156:22-4.153.228.146:47008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:20:05.523070 systemd[1]: session-25.scope: Deactivated successfully.
Jan 16 21:20:05.524428 systemd-logind[1647]: Session 25 logged out. Waiting for processes to exit.
Jan 16 21:20:05.525620 systemd-logind[1647]: Removed session 25.
Jan 16 21:20:06.671522 kubelet[2882]: E0116 21:20:06.671476 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58"
Jan 16 21:20:10.617570 systemd[1]: Started sshd@24-10.0.3.156:22-4.153.228.146:47020.service - OpenSSH per-connection server daemon (4.153.228.146:47020).
Jan 16 21:20:10.617000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.156:22-4.153.228.146:47020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:20:10.619286 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 16 21:20:10.619346 kernel: audit: type=1130 audit(1768598410.617:887): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.156:22-4.153.228.146:47020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:20:11.165755 sshd[5342]: Accepted publickey for core from 4.153.228.146 port 47020 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs
Jan 16 21:20:11.165000 audit[5342]: USER_ACCT pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.170624 kernel: audit: type=1101 audit(1768598411.165:888): pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.172137 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 16 21:20:11.170000 audit[5342]: CRED_ACQ pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.177658 kernel: audit: type=1103 audit(1768598411.170:889): pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.177724 kernel: audit: type=1006 audit(1768598411.170:890): pid=5342 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1
Jan 16 21:20:11.170000 audit[5342]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd0288fa90 a2=3 a3=0 items=0 ppid=1 pid=5342 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:11.181391 kernel: audit: type=1300 audit(1768598411.170:890): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd0288fa90 a2=3 a3=0 items=0 ppid=1 pid=5342 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:11.183472 systemd-logind[1647]: New session 26 of user core.
Jan 16 21:20:11.170000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 21:20:11.187634 kernel: audit: type=1327 audit(1768598411.170:890): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 21:20:11.188811 systemd[1]: Started session-26.scope - Session 26 of User core.
Jan 16 21:20:11.191000 audit[5342]: USER_START pid=5342 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.196000 audit[5346]: CRED_ACQ pid=5346 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.199611 kernel: audit: type=1105 audit(1768598411.191:891): pid=5342 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.199675 kernel: audit: type=1103 audit(1768598411.196:892): pid=5346 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.531694 sshd[5346]: Connection closed by 4.153.228.146 port 47020
Jan 16 21:20:11.532133 sshd-session[5342]: pam_unix(sshd:session): session closed for user core
Jan 16 21:20:11.533000 audit[5342]: USER_END pid=5342 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.539627 kernel: audit: type=1106 audit(1768598411.533:893): pid=5342 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.539757 systemd[1]: sshd@24-10.0.3.156:22-4.153.228.146:47020.service: Deactivated successfully.
Jan 16 21:20:11.533000 audit[5342]: CRED_DISP pid=5342 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.541498 systemd[1]: session-26.scope: Deactivated successfully.
Jan 16 21:20:11.545000 kernel: audit: type=1104 audit(1768598411.533:894): pid=5342 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 21:20:11.545209 systemd-logind[1647]: Session 26 logged out. Waiting for processes to exit.
Jan 16 21:20:11.539000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.156:22-4.153.228.146:47020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 21:20:11.548446 systemd-logind[1647]: Removed session 26.
Jan 16 21:20:13.671221 kubelet[2882]: E0116 21:20:13.671017 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f"
Jan 16 21:20:13.672045 kubelet[2882]: E0116 21:20:13.671699 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c"
Jan 16 21:20:14.670728 kubelet[2882]: E0116 21:20:14.670675 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3"
Jan 16 21:20:15.669922 kubelet[2882]: E0116 21:20:15.669887 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61"
Jan 16 21:20:16.642169 systemd[1]: Started sshd@25-10.0.3.156:22-4.153.228.146:39918.service - OpenSSH per-connection server daemon (4.153.228.146:39918).
Jan 16 21:20:16.643309 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:20:16.643347 kernel: audit: type=1130 audit(1768598416.641:896): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.156:22-4.153.228.146:39918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:20:16.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.156:22-4.153.228.146:39918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:20:17.172000 audit[5363]: USER_ACCT pid=5363 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.176754 sshd[5363]: Accepted publickey for core from 4.153.228.146 port 39918 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:20:17.178733 sshd-session[5363]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:20:17.177000 audit[5363]: CRED_ACQ pid=5363 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.180797 kernel: audit: type=1101 audit(1768598417.172:897): pid=5363 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.180851 kernel: audit: type=1103 audit(1768598417.177:898): pid=5363 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.184785 kernel: audit: type=1006 audit(1768598417.177:899): pid=5363 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Jan 16 21:20:17.177000 audit[5363]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffce6505c50 a2=3 a3=0 items=0 ppid=1 pid=5363 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:20:17.188358 kernel: audit: type=1300 audit(1768598417.177:899): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffce6505c50 a2=3 a3=0 items=0 ppid=1 pid=5363 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:20:17.177000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:20:17.192710 kernel: audit: type=1327 audit(1768598417.177:899): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:20:17.193042 systemd-logind[1647]: New session 27 of user core. Jan 16 21:20:17.200850 systemd[1]: Started session-27.scope - Session 27 of User core. 
Jan 16 21:20:17.202000 audit[5363]: USER_START pid=5363 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.204000 audit[5367]: CRED_ACQ pid=5367 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.210184 kernel: audit: type=1105 audit(1768598417.202:900): pid=5363 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.210324 kernel: audit: type=1103 audit(1768598417.204:901): pid=5367 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.552683 sshd[5367]: Connection closed by 4.153.228.146 port 39918 Jan 16 21:20:17.552806 sshd-session[5363]: pam_unix(sshd:session): session closed for user core Jan 16 21:20:17.553000 audit[5363]: USER_END pid=5363 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.560632 kernel: audit: type=1106 
audit(1768598417.553:902): pid=5363 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.553000 audit[5363]: CRED_DISP pid=5363 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.564412 systemd[1]: sshd@25-10.0.3.156:22-4.153.228.146:39918.service: Deactivated successfully. Jan 16 21:20:17.566854 kernel: audit: type=1104 audit(1768598417.553:903): pid=5363 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:17.567577 systemd[1]: session-27.scope: Deactivated successfully. Jan 16 21:20:17.563000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.156:22-4.153.228.146:39918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:20:17.569558 systemd-logind[1647]: Session 27 logged out. Waiting for processes to exit. Jan 16 21:20:17.570989 systemd-logind[1647]: Removed session 27. 
Jan 16 21:20:17.670763 kubelet[2882]: E0116 21:20:17.670727 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:20:19.671285 kubelet[2882]: E0116 21:20:19.671052 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:20:22.663931 systemd[1]: Started sshd@26-10.0.3.156:22-4.153.228.146:39932.service - OpenSSH per-connection server daemon (4.153.228.146:39932). Jan 16 21:20:22.669106 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:20:22.669150 kernel: audit: type=1130 audit(1768598422.663:905): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.3.156:22-4.153.228.146:39932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:20:22.663000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.3.156:22-4.153.228.146:39932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 21:20:23.189000 audit[5378]: USER_ACCT pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.193193 sshd[5378]: Accepted publickey for core from 4.153.228.146 port 39932 ssh2: RSA SHA256:bS5ZIeZwWVVoAGnX02c/6nLNZbVn3G5kxaKDLP4fvzs Jan 16 21:20:23.194897 sshd-session[5378]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 21:20:23.192000 audit[5378]: CRED_ACQ pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.196413 kernel: audit: type=1101 audit(1768598423.189:906): pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.196474 kernel: audit: type=1103 audit(1768598423.192:907): pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.192000 audit[5378]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffca1b9ce40 a2=3 a3=0 items=0 ppid=1 pid=5378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:20:23.204377 kernel: audit: type=1006 audit(1768598423.192:908): pid=5378 uid=0 
subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Jan 16 21:20:23.204440 kernel: audit: type=1300 audit(1768598423.192:908): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffca1b9ce40 a2=3 a3=0 items=0 ppid=1 pid=5378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 21:20:23.192000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:20:23.207070 systemd-logind[1647]: New session 28 of user core. Jan 16 21:20:23.208185 kernel: audit: type=1327 audit(1768598423.192:908): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 21:20:23.213814 systemd[1]: Started session-28.scope - Session 28 of User core. Jan 16 21:20:23.222952 kernel: audit: type=1105 audit(1768598423.216:909): pid=5378 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.216000 audit[5378]: USER_START pid=5378 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.219000 audit[5384]: CRED_ACQ pid=5384 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.230651 kernel: audit: type=1103 
audit(1768598423.219:910): pid=5384 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.547448 sshd[5384]: Connection closed by 4.153.228.146 port 39932 Jan 16 21:20:23.547226 sshd-session[5378]: pam_unix(sshd:session): session closed for user core Jan 16 21:20:23.549000 audit[5378]: USER_END pid=5378 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.555685 kernel: audit: type=1106 audit(1768598423.549:911): pid=5378 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.552555 systemd[1]: sshd@26-10.0.3.156:22-4.153.228.146:39932.service: Deactivated successfully. Jan 16 21:20:23.554501 systemd[1]: session-28.scope: Deactivated successfully. Jan 16 21:20:23.549000 audit[5378]: CRED_DISP pid=5378 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.559998 systemd-logind[1647]: Session 28 logged out. Waiting for processes to exit. 
Jan 16 21:20:23.551000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.3.156:22-4.153.228.146:39932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 21:20:23.561260 kernel: audit: type=1104 audit(1768598423.549:912): pid=5378 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 21:20:23.561981 systemd-logind[1647]: Removed session 28. Jan 16 21:20:24.671051 kubelet[2882]: E0116 21:20:24.671017 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:20:27.674846 kubelet[2882]: E0116 21:20:27.674143 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:20:28.670209 kubelet[2882]: E0116 21:20:28.669800 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:20:28.670869 kubelet[2882]: E0116 21:20:28.670838 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:20:30.670377 kubelet[2882]: E0116 21:20:30.670161 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:20:34.672135 containerd[1678]: time="2026-01-16T21:20:34.671870847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:20:35.018812 containerd[1678]: time="2026-01-16T21:20:35.018378333Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:20:35.020758 containerd[1678]: time="2026-01-16T21:20:35.020633549Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:20:35.020758 containerd[1678]: time="2026-01-16T21:20:35.020682013Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:20:35.020923 kubelet[2882]: E0116 21:20:35.020866 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:20:35.021317 kubelet[2882]: E0116 21:20:35.020928 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:20:35.021317 kubelet[2882]: E0116 21:20:35.021049 2882 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdsvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-9lshv_calico-apiserver(ce4d12ad-802f-4a94-a69f-0be1648ae09b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:20:35.022388 kubelet[2882]: E0116 21:20:35.022349 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:20:38.670265 containerd[1678]: time="2026-01-16T21:20:38.670225691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 21:20:39.001332 containerd[1678]: time="2026-01-16T21:20:39.000701537Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
21:20:39.002724 containerd[1678]: time="2026-01-16T21:20:39.002622489Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 21:20:39.002724 containerd[1678]: time="2026-01-16T21:20:39.002677883Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 21:20:39.002882 kubelet[2882]: E0116 21:20:39.002835 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:20:39.003126 kubelet[2882]: E0116 21:20:39.002892 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 21:20:39.003221 kubelet[2882]: E0116 21:20:39.003181 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aba7cb893c2466794c6629c238ba8cf,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 21:20:39.005166 containerd[1678]: time="2026-01-16T21:20:39.005138823Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 21:20:39.341952 containerd[1678]: 
time="2026-01-16T21:20:39.341730109Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:20:39.343410 containerd[1678]: time="2026-01-16T21:20:39.343323836Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 21:20:39.343476 containerd[1678]: time="2026-01-16T21:20:39.343391483Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 21:20:39.343610 kubelet[2882]: E0116 21:20:39.343538 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:20:39.343610 kubelet[2882]: E0116 21:20:39.343586 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 21:20:39.343754 kubelet[2882]: E0116 21:20:39.343707 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bntxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b5c8f6bc4-rwplp_calico-system(96381c87-0aa3-4b10-9efd-84c74923efa3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 21:20:39.345052 kubelet[2882]: E0116 21:20:39.345004 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:20:39.671567 containerd[1678]: time="2026-01-16T21:20:39.671481878Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 21:20:40.010087 containerd[1678]: time="2026-01-16T21:20:40.009968769Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:20:40.011688 containerd[1678]: time="2026-01-16T21:20:40.011616410Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 21:20:40.011688 containerd[1678]: time="2026-01-16T21:20:40.011654514Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 21:20:40.011924 kubelet[2882]: E0116 21:20:40.011858 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:20:40.011924 kubelet[2882]: E0116 21:20:40.011911 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 21:20:40.012554 kubelet[2882]: E0116 21:20:40.012306 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btq9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHan
dler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56f896845d-kkjxr_calico-system(91564960-b99d-4a80-9373-1c0eb8e68a7f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 21:20:40.013548 kubelet[2882]: E0116 21:20:40.013520 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" 
podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:20:40.670253 containerd[1678]: time="2026-01-16T21:20:40.670149146Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 21:20:41.010574 containerd[1678]: time="2026-01-16T21:20:41.010430330Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:20:41.012337 containerd[1678]: time="2026-01-16T21:20:41.012192662Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 21:20:41.012446 containerd[1678]: time="2026-01-16T21:20:41.012251276Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 21:20:41.012580 kubelet[2882]: E0116 21:20:41.012547 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:20:41.012851 kubelet[2882]: E0116 21:20:41.012590 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 21:20:41.013683 kubelet[2882]: E0116 21:20:41.013120 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72sgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6787d9fcf6-mcbl2_calico-apiserver(eb6b7565-0bac-4e11-9918-28c46c9c8c58): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 21:20:41.014944 kubelet[2882]: E0116 21:20:41.014813 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:20:43.672147 containerd[1678]: time="2026-01-16T21:20:43.672042394Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 21:20:44.023626 containerd[1678]: time="2026-01-16T21:20:44.023352429Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:20:44.024898 containerd[1678]: time="2026-01-16T21:20:44.024870801Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 21:20:44.024954 containerd[1678]: time="2026-01-16T21:20:44.024938587Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 21:20:44.025092 kubelet[2882]: E0116 21:20:44.025056 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:20:44.025370 kubelet[2882]: E0116 21:20:44.025098 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 21:20:44.025370 kubelet[2882]: E0116 21:20:44.025209 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 16 21:20:44.027099 containerd[1678]: time="2026-01-16T21:20:44.026939530Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 21:20:44.389989 containerd[1678]: time="2026-01-16T21:20:44.389537702Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:20:44.391124 containerd[1678]: time="2026-01-16T21:20:44.391093121Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 21:20:44.391203 containerd[1678]: time="2026-01-16T21:20:44.391161615Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 21:20:44.391525 kubelet[2882]: E0116 21:20:44.391333 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:20:44.391771 kubelet[2882]: E0116 21:20:44.391591 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 21:20:44.391771 kubelet[2882]: E0116 21:20:44.391719 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wg29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kql5z_calico-system(a7efe392-df52-4d70-9d66-17372a93751c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 21:20:44.392896 kubelet[2882]: E0116 21:20:44.392849 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c" Jan 16 21:20:44.671534 containerd[1678]: time="2026-01-16T21:20:44.671503010Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 21:20:45.015519 containerd[1678]: time="2026-01-16T21:20:45.015307167Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 21:20:45.017050 containerd[1678]: time="2026-01-16T21:20:45.017018848Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 21:20:45.017137 containerd[1678]: time="2026-01-16T21:20:45.017091143Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 21:20:45.017336 kubelet[2882]: E0116 21:20:45.017298 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:20:45.017768 kubelet[2882]: E0116 21:20:45.017554 2882 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 21:20:45.017987 kubelet[2882]: E0116 21:20:45.017929 2882 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twtjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-ns7rw_calico-system(3048fb4b-f634-4014-ae59-9ad3946acc61): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 21:20:45.019132 kubelet[2882]: E0116 21:20:45.019099 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61" Jan 16 21:20:45.672439 kubelet[2882]: E0116 21:20:45.672369 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" Jan 16 21:20:51.670971 kubelet[2882]: E0116 21:20:51.670928 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f" Jan 16 21:20:52.669684 kubelet[2882]: E0116 21:20:52.669639 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-mcbl2" podUID="eb6b7565-0bac-4e11-9918-28c46c9c8c58" Jan 16 21:20:52.670201 kubelet[2882]: E0116 21:20:52.670160 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3" Jan 16 21:20:54.836719 systemd[1]: cri-containerd-cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae.scope: Deactivated successfully. Jan 16 21:20:54.837023 systemd[1]: cri-containerd-cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae.scope: Consumed 2.980s CPU time, 63.1M memory peak, 192K read from disk. 
Jan 16 21:20:54.839647 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 21:20:54.839731 kernel: audit: type=1334 audit(1768598454.837:914): prog-id=256 op=LOAD Jan 16 21:20:54.837000 audit: BPF prog-id=256 op=LOAD Jan 16 21:20:54.840994 containerd[1678]: time="2026-01-16T21:20:54.840966260Z" level=info msg="received container exit event container_id:\"cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae\" id:\"cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae\" pid:2742 exit_status:1 exited_at:{seconds:1768598454 nanos:838714107}" Jan 16 21:20:54.837000 audit: BPF prog-id=93 op=UNLOAD Jan 16 21:20:54.844619 kernel: audit: type=1334 audit(1768598454.837:915): prog-id=93 op=UNLOAD Jan 16 21:20:54.841000 audit: BPF prog-id=108 op=UNLOAD Jan 16 21:20:54.846618 kernel: audit: type=1334 audit(1768598454.841:916): prog-id=108 op=UNLOAD Jan 16 21:20:54.841000 audit: BPF prog-id=112 op=UNLOAD Jan 16 21:20:54.848630 kernel: audit: type=1334 audit(1768598454.841:917): prog-id=112 op=UNLOAD Jan 16 21:20:54.866929 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae-rootfs.mount: Deactivated successfully. Jan 16 21:20:54.991893 update_engine[1649]: I20260116 21:20:54.991814 1649 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 16 21:20:54.991893 update_engine[1649]: I20260116 21:20:54.991864 1649 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 16 21:20:54.992320 update_engine[1649]: I20260116 21:20:54.992058 1649 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 16 21:20:54.992431 update_engine[1649]: I20260116 21:20:54.992389 1649 omaha_request_params.cc:62] Current group set to developer Jan 16 21:20:54.992656 update_engine[1649]: I20260116 21:20:54.992488 1649 update_attempter.cc:499] Already updated boot flags. Skipping. 
Jan 16 21:20:54.992656 update_engine[1649]: I20260116 21:20:54.992498 1649 update_attempter.cc:643] Scheduling an action processor start. Jan 16 21:20:54.992656 update_engine[1649]: I20260116 21:20:54.992513 1649 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 16 21:20:54.999739 locksmithd[1704]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 16 21:20:55.004319 update_engine[1649]: I20260116 21:20:55.004271 1649 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 16 21:20:55.004401 update_engine[1649]: I20260116 21:20:55.004365 1649 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 16 21:20:55.004401 update_engine[1649]: I20260116 21:20:55.004374 1649 omaha_request_action.cc:272] Request: Jan 16 21:20:55.004401 update_engine[1649]: Jan 16 21:20:55.004401 update_engine[1649]: Jan 16 21:20:55.004401 update_engine[1649]: Jan 16 21:20:55.004401 update_engine[1649]: Jan 16 21:20:55.004401 update_engine[1649]: Jan 16 21:20:55.004401 update_engine[1649]: Jan 16 21:20:55.004401 update_engine[1649]: Jan 16 21:20:55.004401 update_engine[1649]: Jan 16 21:20:55.004401 update_engine[1649]: I20260116 21:20:55.004380 1649 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 16 21:20:55.008463 update_engine[1649]: I20260116 21:20:55.008410 1649 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 16 21:20:55.009296 update_engine[1649]: I20260116 21:20:55.009240 1649 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 16 21:20:55.020567 update_engine[1649]: E20260116 21:20:55.020498 1649 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 16 21:20:55.020705 update_engine[1649]: I20260116 21:20:55.020586 1649 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 16 21:20:55.102632 kubelet[2882]: E0116 21:20:55.102410 2882 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.156:56410->10.0.3.147:2379: read: connection timed out" event="&Event{ObjectMeta:{goldmane-666569f655-ns7rw.188b52befb837332 calico-system 1361 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:goldmane-666569f655-ns7rw,UID:3048fb4b-f634-4014-ae59-9ad3946acc61,APIVersion:v1,ResourceVersion:780,FieldPath:spec.containers{goldmane},},Reason:Pulling,Message:Pulling image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4580-0-0-p-be73a47b79,},FirstTimestamp:2026-01-16 21:17:38 +0000 UTC,LastTimestamp:2026-01-16 21:20:44.670308557 +0000 UTC m=+229.105247414,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4580-0-0-p-be73a47b79,}" Jan 16 21:20:55.225865 kubelet[2882]: I0116 21:20:55.225840 2882 scope.go:117] "RemoveContainer" containerID="cc391d116d15810d51975eb4225ad31f2e3f71263bfdc54304997355dce9caae" Jan 16 21:20:55.228390 containerd[1678]: time="2026-01-16T21:20:55.228341759Z" level=info msg="CreateContainer within sandbox \"d4fcc08819c1599601745ba6dae5e6676dcb7b547a1d80e7c0cb4e5c7027cb28\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 16 21:20:55.245439 containerd[1678]: time="2026-01-16T21:20:55.244736031Z" level=info msg="Container 7a7ff8afd3ac783459c0b8e30037300f455fe2b795748fc9eefa75dce9ef2743: CDI devices 
from CRI Config.CDIDevices: []"
Jan 16 21:20:55.254731 containerd[1678]: time="2026-01-16T21:20:55.254692344Z" level=info msg="CreateContainer within sandbox \"d4fcc08819c1599601745ba6dae5e6676dcb7b547a1d80e7c0cb4e5c7027cb28\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"7a7ff8afd3ac783459c0b8e30037300f455fe2b795748fc9eefa75dce9ef2743\""
Jan 16 21:20:55.255360 containerd[1678]: time="2026-01-16T21:20:55.255180969Z" level=info msg="StartContainer for \"7a7ff8afd3ac783459c0b8e30037300f455fe2b795748fc9eefa75dce9ef2743\""
Jan 16 21:20:55.256318 containerd[1678]: time="2026-01-16T21:20:55.256298434Z" level=info msg="connecting to shim 7a7ff8afd3ac783459c0b8e30037300f455fe2b795748fc9eefa75dce9ef2743" address="unix:///run/containerd/s/cb3b851f073d8fd6cfbd47c5c855c488e7d07ee86f18b65007b780e2e35bcdc4" protocol=ttrpc version=3
Jan 16 21:20:55.275982 systemd[1]: Started cri-containerd-7a7ff8afd3ac783459c0b8e30037300f455fe2b795748fc9eefa75dce9ef2743.scope - libcontainer container 7a7ff8afd3ac783459c0b8e30037300f455fe2b795748fc9eefa75dce9ef2743.
Jan 16 21:20:55.287000 audit: BPF prog-id=257 op=LOAD
Jan 16 21:20:55.287000 audit: BPF prog-id=258 op=LOAD
Jan 16 21:20:55.290806 kernel: audit: type=1334 audit(1768598455.287:918): prog-id=257 op=LOAD
Jan 16 21:20:55.290860 kernel: audit: type=1334 audit(1768598455.287:919): prog-id=258 op=LOAD
Jan 16 21:20:55.287000 audit[5454]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.294015 kernel: audit: type=1300 audit(1768598455.287:919): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376666386166643361633738333435396330623865333030333733
Jan 16 21:20:55.302617 kernel: audit: type=1327 audit(1768598455.287:919): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376666386166643361633738333435396330623865333030333733
Jan 16 21:20:55.303271 kubelet[2882]: E0116 21:20:55.303106 2882 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.156:56588->10.0.3.147:2379: read: connection timed out"
Jan 16 21:20:55.287000 audit: BPF prog-id=258 op=UNLOAD
Jan 16 21:20:55.287000 audit[5454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.311365 kernel: audit: type=1334 audit(1768598455.287:920): prog-id=258 op=UNLOAD
Jan 16 21:20:55.311405 kernel: audit: type=1300 audit(1768598455.287:920): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376666386166643361633738333435396330623865333030333733
Jan 16 21:20:55.287000 audit: BPF prog-id=259 op=LOAD
Jan 16 21:20:55.287000 audit[5454]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376666386166643361633738333435396330623865333030333733
Jan 16 21:20:55.287000 audit: BPF prog-id=260 op=LOAD
Jan 16 21:20:55.287000 audit[5454]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376666386166643361633738333435396330623865333030333733
Jan 16 21:20:55.287000 audit: BPF prog-id=260 op=UNLOAD
Jan 16 21:20:55.287000 audit[5454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376666386166643361633738333435396330623865333030333733
Jan 16 21:20:55.287000 audit: BPF prog-id=259 op=UNLOAD
Jan 16 21:20:55.287000 audit[5454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376666386166643361633738333435396330623865333030333733
Jan 16 21:20:55.287000 audit: BPF prog-id=261 op=LOAD
Jan 16 21:20:55.287000 audit[5454]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2593 pid=5454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376666386166643361633738333435396330623865333030333733
Jan 16 21:20:55.333313 containerd[1678]: time="2026-01-16T21:20:55.333188490Z" level=info msg="StartContainer for \"7a7ff8afd3ac783459c0b8e30037300f455fe2b795748fc9eefa75dce9ef2743\" returns successfully"
Jan 16 21:20:55.999660 systemd[1]: cri-containerd-537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a.scope: Deactivated successfully.
Jan 16 21:20:55.999947 systemd[1]: cri-containerd-537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a.scope: Consumed 26.042s CPU time, 111.1M memory peak.
Jan 16 21:20:56.003093 containerd[1678]: time="2026-01-16T21:20:56.002924975Z" level=info msg="received container exit event container_id:\"537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a\" id:\"537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a\" pid:3217 exit_status:1 exited_at:{seconds:1768598456 nanos:2478987}"
Jan 16 21:20:56.003000 audit: BPF prog-id=146 op=UNLOAD
Jan 16 21:20:56.003000 audit: BPF prog-id=150 op=UNLOAD
Jan 16 21:20:56.030462 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a-rootfs.mount: Deactivated successfully.
Jan 16 21:20:56.126702 kubelet[2882]: I0116 21:20:56.125829 2882 status_manager.go:890] "Failed to get status for pod" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.156:56514->10.0.3.147:2379: read: connection timed out"
Jan 16 21:20:56.229395 kubelet[2882]: I0116 21:20:56.229195 2882 scope.go:117] "RemoveContainer" containerID="537f2e095ced960fb26bb2f9a49f7d461aec69f1e730fda4b7cb5b88faf3a24a"
Jan 16 21:20:56.233385 containerd[1678]: time="2026-01-16T21:20:56.232619455Z" level=info msg="CreateContainer within sandbox \"3777d1e49e5078a3ed4f807d7d317f14c02fad1f46a1a8ba0f0accda6bf305fb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jan 16 21:20:56.245885 containerd[1678]: time="2026-01-16T21:20:56.245844766Z" level=info msg="Container 31097f7f9141ba4bed6504ef02fdbf3a9aac9b29c2911843d1af31f33eaa1b53: CDI devices from CRI Config.CDIDevices: []"
Jan 16 21:20:56.258508 containerd[1678]: time="2026-01-16T21:20:56.258427941Z" level=info msg="CreateContainer within sandbox \"3777d1e49e5078a3ed4f807d7d317f14c02fad1f46a1a8ba0f0accda6bf305fb\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"31097f7f9141ba4bed6504ef02fdbf3a9aac9b29c2911843d1af31f33eaa1b53\""
Jan 16 21:20:56.259453 containerd[1678]: time="2026-01-16T21:20:56.259431204Z" level=info msg="StartContainer for \"31097f7f9141ba4bed6504ef02fdbf3a9aac9b29c2911843d1af31f33eaa1b53\""
Jan 16 21:20:56.260690 containerd[1678]: time="2026-01-16T21:20:56.260666896Z" level=info msg="connecting to shim 31097f7f9141ba4bed6504ef02fdbf3a9aac9b29c2911843d1af31f33eaa1b53" address="unix:///run/containerd/s/d948cc3455b16aa1f3f6482c30f256b8abd7a6ae36bab4b9af1e6ba2486619e6" protocol=ttrpc version=3
Jan 16 21:20:56.283964 systemd[1]: Started cri-containerd-31097f7f9141ba4bed6504ef02fdbf3a9aac9b29c2911843d1af31f33eaa1b53.scope - libcontainer container 31097f7f9141ba4bed6504ef02fdbf3a9aac9b29c2911843d1af31f33eaa1b53.
Jan 16 21:20:56.296000 audit: BPF prog-id=262 op=LOAD
Jan 16 21:20:56.297000 audit: BPF prog-id=263 op=LOAD
Jan 16 21:20:56.297000 audit[5497]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3004 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331303937663766393134316261346265643635303465663032666462
Jan 16 21:20:56.297000 audit: BPF prog-id=263 op=UNLOAD
Jan 16 21:20:56.297000 audit[5497]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331303937663766393134316261346265643635303465663032666462
Jan 16 21:20:56.297000 audit: BPF prog-id=264 op=LOAD
Jan 16 21:20:56.297000 audit[5497]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3004 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331303937663766393134316261346265643635303465663032666462
Jan 16 21:20:56.297000 audit: BPF prog-id=265 op=LOAD
Jan 16 21:20:56.297000 audit[5497]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3004 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331303937663766393134316261346265643635303465663032666462
Jan 16 21:20:56.297000 audit: BPF prog-id=265 op=UNLOAD
Jan 16 21:20:56.297000 audit[5497]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331303937663766393134316261346265643635303465663032666462
Jan 16 21:20:56.297000 audit: BPF prog-id=264 op=UNLOAD
Jan 16 21:20:56.297000 audit[5497]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3004 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331303937663766393134316261346265643635303465663032666462
Jan 16 21:20:56.298000 audit: BPF prog-id=266 op=LOAD
Jan 16 21:20:56.298000 audit[5497]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3004 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:20:56.298000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331303937663766393134316261346265643635303465663032666462
Jan 16 21:20:56.333861 containerd[1678]: time="2026-01-16T21:20:56.333819815Z" level=info msg="StartContainer for \"31097f7f9141ba4bed6504ef02fdbf3a9aac9b29c2911843d1af31f33eaa1b53\" returns successfully"
Jan 16 21:20:56.670211 kubelet[2882]: E0116 21:20:56.670126 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-ns7rw" podUID="3048fb4b-f634-4014-ae59-9ad3946acc61"
Jan 16 21:20:58.670473 kubelet[2882]: E0116 21:20:58.670434 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6787d9fcf6-9lshv" podUID="ce4d12ad-802f-4a94-a69f-0be1648ae09b"
Jan 16 21:20:59.671297 kubelet[2882]: E0116 21:20:59.671235 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kql5z" podUID="a7efe392-df52-4d70-9d66-17372a93751c"
Jan 16 21:21:00.543000 audit: BPF prog-id=267 op=LOAD
Jan 16 21:21:00.542766 systemd[1]: cri-containerd-640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435.scope: Deactivated successfully.
Jan 16 21:21:00.543067 systemd[1]: cri-containerd-640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435.scope: Consumed 2.329s CPU time, 21.6M memory peak, 128K read from disk.
Jan 16 21:21:00.544964 kernel: kauditd_printk_skb: 40 callbacks suppressed
Jan 16 21:21:00.545056 kernel: audit: type=1334 audit(1768598460.543:936): prog-id=267 op=LOAD
Jan 16 21:21:00.546647 kernel: audit: type=1334 audit(1768598460.543:937): prog-id=88 op=UNLOAD
Jan 16 21:21:00.543000 audit: BPF prog-id=88 op=UNLOAD
Jan 16 21:21:00.548106 containerd[1678]: time="2026-01-16T21:21:00.547967007Z" level=info msg="received container exit event container_id:\"640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435\" id:\"640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435\" pid:2703 exit_status:1 exited_at:{seconds:1768598460 nanos:546905678}"
Jan 16 21:21:00.547000 audit: BPF prog-id=98 op=UNLOAD
Jan 16 21:21:00.548879 kernel: audit: type=1334 audit(1768598460.547:938): prog-id=98 op=UNLOAD
Jan 16 21:21:00.547000 audit: BPF prog-id=102 op=UNLOAD
Jan 16 21:21:00.551668 kernel: audit: type=1334 audit(1768598460.547:939): prog-id=102 op=UNLOAD
Jan 16 21:21:00.571663 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435-rootfs.mount: Deactivated successfully.
Jan 16 21:21:01.245449 kubelet[2882]: I0116 21:21:01.245417 2882 scope.go:117] "RemoveContainer" containerID="640093b186e811968b568e375d53a063a38403d48a6e351ff423d94d3fe0b435"
Jan 16 21:21:01.248207 containerd[1678]: time="2026-01-16T21:21:01.248177493Z" level=info msg="CreateContainer within sandbox \"905553eceb2bbaf9d7d35a7d3e15c55531ed6930684ba1ea4b20824570cd4ff1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jan 16 21:21:01.272620 containerd[1678]: time="2026-01-16T21:21:01.271201541Z" level=info msg="Container b99f25001b29850057640ae105fb7beadf2ec48a2c075c76dce493e81033f1ae: CDI devices from CRI Config.CDIDevices: []"
Jan 16 21:21:01.282393 containerd[1678]: time="2026-01-16T21:21:01.282283756Z" level=info msg="CreateContainer within sandbox \"905553eceb2bbaf9d7d35a7d3e15c55531ed6930684ba1ea4b20824570cd4ff1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"b99f25001b29850057640ae105fb7beadf2ec48a2c075c76dce493e81033f1ae\""
Jan 16 21:21:01.282955 containerd[1678]: time="2026-01-16T21:21:01.282923318Z" level=info msg="StartContainer for \"b99f25001b29850057640ae105fb7beadf2ec48a2c075c76dce493e81033f1ae\""
Jan 16 21:21:01.283844 containerd[1678]: time="2026-01-16T21:21:01.283823571Z" level=info msg="connecting to shim b99f25001b29850057640ae105fb7beadf2ec48a2c075c76dce493e81033f1ae" address="unix:///run/containerd/s/2bbd2590fd781d3445f3b4b6268b14f4a48654f8eb55a43532a5fb6dcc3bf322" protocol=ttrpc version=3
Jan 16 21:21:01.301789 systemd[1]: Started cri-containerd-b99f25001b29850057640ae105fb7beadf2ec48a2c075c76dce493e81033f1ae.scope - libcontainer container b99f25001b29850057640ae105fb7beadf2ec48a2c075c76dce493e81033f1ae.
Jan 16 21:21:01.312000 audit: BPF prog-id=268 op=LOAD
Jan 16 21:21:01.314620 kernel: audit: type=1334 audit(1768598461.312:940): prog-id=268 op=LOAD
Jan 16 21:21:01.314683 kernel: audit: type=1334 audit(1768598461.313:941): prog-id=269 op=LOAD
Jan 16 21:21:01.313000 audit: BPF prog-id=269 op=LOAD
Jan 16 21:21:01.313000 audit[5543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.317916 kernel: audit: type=1300 audit(1768598461.313:941): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239396632353030316232393835303035373634306165313035666237
Jan 16 21:21:01.322027 kernel: audit: type=1327 audit(1768598461.313:941): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239396632353030316232393835303035373634306165313035666237
Jan 16 21:21:01.313000 audit: BPF prog-id=269 op=UNLOAD
Jan 16 21:21:01.325094 kernel: audit: type=1334 audit(1768598461.313:942): prog-id=269 op=UNLOAD
Jan 16 21:21:01.313000 audit[5543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.327323 kernel: audit: type=1300 audit(1768598461.313:942): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239396632353030316232393835303035373634306165313035666237
Jan 16 21:21:01.315000 audit: BPF prog-id=270 op=LOAD
Jan 16 21:21:01.315000 audit[5543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.315000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239396632353030316232393835303035373634306165313035666237
Jan 16 21:21:01.315000 audit: BPF prog-id=271 op=LOAD
Jan 16 21:21:01.315000 audit[5543]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.315000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239396632353030316232393835303035373634306165313035666237
Jan 16 21:21:01.315000 audit: BPF prog-id=271 op=UNLOAD
Jan 16 21:21:01.315000 audit[5543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.315000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239396632353030316232393835303035373634306165313035666237
Jan 16 21:21:01.315000 audit: BPF prog-id=270 op=UNLOAD
Jan 16 21:21:01.315000 audit[5543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.315000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239396632353030316232393835303035373634306165313035666237
Jan 16 21:21:01.315000 audit: BPF prog-id=272 op=LOAD
Jan 16 21:21:01.315000 audit[5543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2570 pid=5543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 21:21:01.315000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239396632353030316232393835303035373634306165313035666237
Jan 16 21:21:01.361457 containerd[1678]: time="2026-01-16T21:21:01.361427190Z" level=info msg="StartContainer for \"b99f25001b29850057640ae105fb7beadf2ec48a2c075c76dce493e81033f1ae\" returns successfully"
Jan 16 21:21:02.670131 kubelet[2882]: E0116 21:21:02.670095 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56f896845d-kkjxr" podUID="91564960-b99d-4a80-9373-1c0eb8e68a7f"
Jan 16 21:21:04.672013 kubelet[2882]: E0116 21:21:04.671967 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6b5c8f6bc4-rwplp" podUID="96381c87-0aa3-4b10-9efd-84c74923efa3"
Jan 16 21:21:04.993471 update_engine[1649]: I20260116 21:21:04.992939 1649 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 16 21:21:04.993471 update_engine[1649]: I20260116 21:21:04.993041 1649 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 16 21:21:04.993471 update_engine[1649]: I20260116 21:21:04.993360 1649 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 16 21:21:05.002998 update_engine[1649]: E20260116 21:21:05.002866 1649 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found)
Jan 16 21:21:05.002998 update_engine[1649]: I20260116 21:21:05.002966 1649 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Jan 16 21:21:05.304283 kubelet[2882]: E0116 21:21:05.303930 2882 controller.go:195] "Failed to update lease" err="Put \"https://10.0.3.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-be73a47b79?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
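The long `proctitle=` hex runs in the audit records above are the audited process's argv, hex-encoded with NUL bytes as argument separators. A minimal Python sketch of how to decode one (the value here is a shortened example, not the full string from the log):

```python
# Audit PROCTITLE is hex-encoded argv with NUL separators.
# This is a shortened example value covering only the first three
# arguments of the runc invocations seen above.
hex_proctitle = (
    "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63"
    "2F6B38732E696F"
)
argv = [part.decode() for part in bytes.fromhex(hex_proctitle).split(b"\x00")]
print(" ".join(argv))  # runc --root /run/containerd/runc/k8s.io
```

Applied to the full field values, this recovers the complete `runc --root … --log …` command lines that created the containers logged in this section.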