Jan 28 01:14:02.060277 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Tue Jan 27 22:22:24 -00 2026
Jan 28 01:14:02.060314 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=71544b7bf64a92b2aba342c16b083723a12bedf106d3ddb24ccb63046196f1b3
Jan 28 01:14:02.060324 kernel: BIOS-provided physical RAM map:
Jan 28 01:14:02.060331 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 28 01:14:02.060337 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Jan 28 01:14:02.060343 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jan 28 01:14:02.060353 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Jan 28 01:14:02.060359 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jan 28 01:14:02.060366 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Jan 28 01:14:02.060372 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jan 28 01:14:02.060378 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable
Jan 28 01:14:02.060385 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Jan 28 01:14:02.060391 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Jan 28 01:14:02.060397 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Jan 28 01:14:02.060407 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Jan 28 01:14:02.060413 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 28 01:14:02.060420 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 28 01:14:02.060427 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 28 01:14:02.060433 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable
Jan 28 01:14:02.060440 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Jan 28 01:14:02.060448 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Jan 28 01:14:02.060454 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable
Jan 28 01:14:02.060461 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Jan 28 01:14:02.060467 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Jan 28 01:14:02.060474 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 28 01:14:02.060480 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 28 01:14:02.060486 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 28 01:14:02.060493 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable
Jan 28 01:14:02.060499 kernel: NX (Execute Disable) protection: active
Jan 28 01:14:02.060506 kernel: APIC: Static calls initialized
Jan 28 01:14:02.060512 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable
Jan 28 01:14:02.060522 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable
Jan 28 01:14:02.060528 kernel: extended physical RAM map:
Jan 28 01:14:02.060535 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 28 01:14:02.060542 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Jan 28 01:14:02.060548 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jan 28 01:14:02.060555 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Jan 28 01:14:02.060561 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jan 28 01:14:02.060568 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Jan 28 01:14:02.060575 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jan 28 01:14:02.060587 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable
Jan 28 01:14:02.060594 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable
Jan 28 01:14:02.060601 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable
Jan 28 01:14:02.060608 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable
Jan 28 01:14:02.060616 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable
Jan 28 01:14:02.060623 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Jan 28 01:14:02.060630 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Jan 28 01:14:02.060637 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Jan 28 01:14:02.060644 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Jan 28 01:14:02.060651 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 28 01:14:02.060658 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 28 01:14:02.060665 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 28 01:14:02.060672 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable
Jan 28 01:14:02.060679 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Jan 28 01:14:02.060685 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Jan 28 01:14:02.060694 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable
Jan 28 01:14:02.060701 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Jan 28 01:14:02.060708 kernel: reserve setup_data: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Jan 28 01:14:02.060715 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 28 01:14:02.060722 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 28 01:14:02.060729 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 28 01:14:02.060736 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable
Jan 28 01:14:02.060743 kernel: efi: EFI v2.7 by EDK II
Jan 28 01:14:02.060750 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018
Jan 28 01:14:02.060757 kernel: random: crng init done
Jan 28 01:14:02.060764 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Jan 28 01:14:02.060772 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Jan 28 01:14:02.060779 kernel: secureboot: Secure boot disabled
Jan 28 01:14:02.060786 kernel: SMBIOS 2.8 present.
Jan 28 01:14:02.060793 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Jan 28 01:14:02.060800 kernel: DMI: Memory slots populated: 1/1
Jan 28 01:14:02.060807 kernel: Hypervisor detected: KVM
Jan 28 01:14:02.060814 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Jan 28 01:14:02.060821 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 28 01:14:02.060828 kernel: kvm-clock: using sched offset of 5584699804 cycles
Jan 28 01:14:02.060836 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 28 01:14:02.060846 kernel: tsc: Detected 2294.608 MHz processor
Jan 28 01:14:02.060853 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 28 01:14:02.060861 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 28 01:14:02.060868 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000
Jan 28 01:14:02.060876 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Jan 28 01:14:02.060884 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 28 01:14:02.060891 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Jan 28 01:14:02.060899 kernel: Using GB pages for direct mapping
Jan 28 01:14:02.060908 kernel: ACPI: Early table checksum verification disabled
Jan 28 01:14:02.060916 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Jan 28 01:14:02.060923 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013)
Jan 28 01:14:02.060931 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 28 01:14:02.060938 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 28 01:14:02.060946 kernel: ACPI: FACS 0x000000007FBDD000 000040
Jan 28 01:14:02.060953 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 28 01:14:02.060963 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 28 01:14:02.060970 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 28 01:14:02.060978 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 28 01:14:02.060985 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3]
Jan 28 01:14:02.060992 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b]
Jan 28 01:14:02.061000 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Jan 28 01:14:02.062958 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f]
Jan 28 01:14:02.062974 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b]
Jan 28 01:14:02.062985 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027]
Jan 28 01:14:02.062994 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037]
Jan 28 01:14:02.063018 kernel: No NUMA configuration found
Jan 28 01:14:02.063033 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff]
Jan 28 01:14:02.063046 kernel: NODE_DATA(0) allocated [mem 0x17fff6dc0-0x17fffdfff]
Jan 28 01:14:02.063055 kernel: Zone ranges:
Jan 28 01:14:02.063063 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 28 01:14:02.063073 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Jan 28 01:14:02.063081 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff]
Jan 28 01:14:02.063089 kernel: Device empty
Jan 28 01:14:02.063096 kernel: Movable zone start for each node
Jan 28 01:14:02.063104 kernel: Early memory node ranges
Jan 28 01:14:02.063111 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Jan 28 01:14:02.063118 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Jan 28 01:14:02.063129 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Jan 28 01:14:02.063136 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Jan 28 01:14:02.063143 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff]
Jan 28 01:14:02.063151 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff]
Jan 28 01:14:02.063159 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff]
Jan 28 01:14:02.063174 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff]
Jan 28 01:14:02.063183 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff]
Jan 28 01:14:02.063191 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff]
Jan 28 01:14:02.063200 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff]
Jan 28 01:14:02.063208 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 28 01:14:02.063218 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Jan 28 01:14:02.063226 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Jan 28 01:14:02.063234 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 28 01:14:02.063242 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Jan 28 01:14:02.063252 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Jan 28 01:14:02.063261 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges
Jan 28 01:14:02.063269 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Jan 28 01:14:02.063277 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Jan 28 01:14:02.063285 kernel: On node 0, zone Normal: 276 pages in unavailable ranges
Jan 28 01:14:02.063294 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 28 01:14:02.063302 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 28 01:14:02.063312 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 28 01:14:02.063320 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 28 01:14:02.063328 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 28 01:14:02.063337 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 28 01:14:02.063345 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 28 01:14:02.063353 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 28 01:14:02.063361 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 28 01:14:02.063370 kernel: TSC deadline timer available
Jan 28 01:14:02.063379 kernel: CPU topo: Max. logical packages: 2
Jan 28 01:14:02.063387 kernel: CPU topo: Max. logical dies: 2
Jan 28 01:14:02.063395 kernel: CPU topo: Max. dies per package: 1
Jan 28 01:14:02.063403 kernel: CPU topo: Max. threads per core: 1
Jan 28 01:14:02.063411 kernel: CPU topo: Num. cores per package: 1
Jan 28 01:14:02.063419 kernel: CPU topo: Num. threads per package: 1
Jan 28 01:14:02.063427 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jan 28 01:14:02.063437 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 28 01:14:02.063445 kernel: kvm-guest: KVM setup pv remote TLB flush
Jan 28 01:14:02.063453 kernel: kvm-guest: setup PV sched yield
Jan 28 01:14:02.063461 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Jan 28 01:14:02.063469 kernel: Booting paravirtualized kernel on KVM
Jan 28 01:14:02.063478 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 28 01:14:02.063486 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jan 28 01:14:02.063496 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jan 28 01:14:02.063505 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jan 28 01:14:02.063513 kernel: pcpu-alloc: [0] 0 1
Jan 28 01:14:02.063521 kernel: kvm-guest: PV spinlocks enabled
Jan 28 01:14:02.063529 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 28 01:14:02.063538 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=71544b7bf64a92b2aba342c16b083723a12bedf106d3ddb24ccb63046196f1b3
Jan 28 01:14:02.063547 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 28 01:14:02.063557 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 28 01:14:02.063565 kernel: Fallback order for Node 0: 0
Jan 28 01:14:02.063573 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1046694
Jan 28 01:14:02.063582 kernel: Policy zone: Normal
Jan 28 01:14:02.063590 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 28 01:14:02.063598 kernel: software IO TLB: area num 2.
Jan 28 01:14:02.063606 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 28 01:14:02.063616 kernel: ftrace: allocating 40128 entries in 157 pages
Jan 28 01:14:02.063624 kernel: ftrace: allocated 157 pages with 5 groups
Jan 28 01:14:02.063632 kernel: Dynamic Preempt: voluntary
Jan 28 01:14:02.063640 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 28 01:14:02.063681 kernel: rcu: RCU event tracing is enabled.
Jan 28 01:14:02.063690 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 28 01:14:02.063698 kernel: Trampoline variant of Tasks RCU enabled.
Jan 28 01:14:02.063709 kernel: Rude variant of Tasks RCU enabled.
Jan 28 01:14:02.063717 kernel: Tracing variant of Tasks RCU enabled.
Jan 28 01:14:02.063725 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 28 01:14:02.063733 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 28 01:14:02.063742 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 28 01:14:02.063750 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 28 01:14:02.063758 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 28 01:14:02.063769 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jan 28 01:14:02.063777 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 28 01:14:02.063785 kernel: Console: colour dummy device 80x25
Jan 28 01:14:02.063793 kernel: printk: legacy console [tty0] enabled
Jan 28 01:14:02.063802 kernel: printk: legacy console [ttyS0] enabled
Jan 28 01:14:02.063810 kernel: ACPI: Core revision 20240827
Jan 28 01:14:02.063818 kernel: APIC: Switch to symmetric I/O mode setup
Jan 28 01:14:02.063826 kernel: x2apic enabled
Jan 28 01:14:02.063837 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 28 01:14:02.063845 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Jan 28 01:14:02.063853 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Jan 28 01:14:02.063861 kernel: kvm-guest: setup PV IPIs
Jan 28 01:14:02.063869 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns
Jan 28 01:14:02.063877 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608)
Jan 28 01:14:02.063885 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 28 01:14:02.063896 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 28 01:14:02.063904 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 28 01:14:02.063911 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 28 01:14:02.063919 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Jan 28 01:14:02.063926 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 28 01:14:02.063934 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 28 01:14:02.063941 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 28 01:14:02.063949 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 28 01:14:02.063956 kernel: TAA: Mitigation: Clear CPU buffers
Jan 28 01:14:02.063964 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Jan 28 01:14:02.063973 kernel: active return thunk: its_return_thunk
Jan 28 01:14:02.063981 kernel: ITS: Mitigation: Aligned branch/return thunks
Jan 28 01:14:02.063988 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 28 01:14:02.063996 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 28 01:14:02.064010 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 28 01:14:02.064018 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Jan 28 01:14:02.064026 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Jan 28 01:14:02.064033 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Jan 28 01:14:02.064041 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Jan 28 01:14:02.064050 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 28 01:14:02.064058 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Jan 28 01:14:02.064065 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Jan 28 01:14:02.064073 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Jan 28 01:14:02.064080 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Jan 28 01:14:02.064088 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Jan 28 01:14:02.064096 kernel: Freeing SMP alternatives memory: 32K
Jan 28 01:14:02.064103 kernel: pid_max: default: 32768 minimum: 301
Jan 28 01:14:02.064110 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 28 01:14:02.064118 kernel: landlock: Up and running.
Jan 28 01:14:02.064125 kernel: SELinux: Initializing.
Jan 28 01:14:02.064133 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 28 01:14:02.064142 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 28 01:14:02.064150 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6)
Jan 28 01:14:02.064158 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver.
Jan 28 01:14:02.064166 kernel: ... version:                2
Jan 28 01:14:02.064174 kernel: ... bit width:              48
Jan 28 01:14:02.064182 kernel: ... generic registers:      8
Jan 28 01:14:02.064190 kernel: ... value mask:             0000ffffffffffff
Jan 28 01:14:02.064198 kernel: ... max period:             00007fffffffffff
Jan 28 01:14:02.064208 kernel: ... fixed-purpose events:   3
Jan 28 01:14:02.064217 kernel: ... event mask:             00000007000000ff
Jan 28 01:14:02.064225 kernel: signal: max sigframe size: 3632
Jan 28 01:14:02.064233 kernel: rcu: Hierarchical SRCU implementation.
Jan 28 01:14:02.064241 kernel: rcu: Max phase no-delay instances is 400.
Jan 28 01:14:02.064249 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 28 01:14:02.064257 kernel: smp: Bringing up secondary CPUs ...
Jan 28 01:14:02.064267 kernel: smpboot: x86: Booting SMP configuration:
Jan 28 01:14:02.064276 kernel: .... node #0, CPUs: #1
Jan 28 01:14:02.064284 kernel: smp: Brought up 1 node, 2 CPUs
Jan 28 01:14:02.064292 kernel: smpboot: Total of 2 processors activated (9178.43 BogoMIPS)
Jan 28 01:14:02.064300 kernel: Memory: 3969760K/4186776K available (14336K kernel code, 2445K rwdata, 31644K rodata, 15536K init, 2500K bss, 212140K reserved, 0K cma-reserved)
Jan 28 01:14:02.064308 kernel: devtmpfs: initialized
Jan 28 01:14:02.064317 kernel: x86/mm: Memory block size: 128MB
Jan 28 01:14:02.064326 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Jan 28 01:14:02.064334 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Jan 28 01:14:02.064342 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Jan 28 01:14:02.064351 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Jan 28 01:14:02.064359 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes)
Jan 28 01:14:02.064367 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes)
Jan 28 01:14:02.064375 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 28 01:14:02.064385 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 28 01:14:02.064393 kernel: pinctrl core: initialized pinctrl subsystem
Jan 28 01:14:02.064401 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 28 01:14:02.064409 kernel: audit: initializing netlink subsys (disabled)
Jan 28 01:14:02.064417 kernel: audit: type=2000 audit(1769562838.384:1): state=initialized audit_enabled=0 res=1
Jan 28 01:14:02.064425 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 28 01:14:02.064433 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 28 01:14:02.064443 kernel: cpuidle: using governor menu
Jan 28 01:14:02.064451 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 28 01:14:02.064459 kernel: dca service started, version 1.12.1
Jan 28 01:14:02.064467 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Jan 28 01:14:02.064475 kernel: PCI: Using configuration type 1 for base access
Jan 28 01:14:02.064483 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 28 01:14:02.064491 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 28 01:14:02.064501 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 28 01:14:02.064509 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 28 01:14:02.064517 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 28 01:14:02.064525 kernel: ACPI: Added _OSI(Module Device)
Jan 28 01:14:02.064533 kernel: ACPI: Added _OSI(Processor Device)
Jan 28 01:14:02.064541 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 28 01:14:02.064549 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 28 01:14:02.064559 kernel: ACPI: Interpreter enabled
Jan 28 01:14:02.064567 kernel: ACPI: PM: (supports S0 S3 S5)
Jan 28 01:14:02.064575 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 28 01:14:02.064583 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 28 01:14:02.064592 kernel: PCI: Using E820 reservations for host bridge windows
Jan 28 01:14:02.064600 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 28 01:14:02.064608 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 28 01:14:02.064810 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 28 01:14:02.064916 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jan 28 01:14:02.065152 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jan 28 01:14:02.065165 kernel: PCI host bridge to bus 0000:00
Jan 28 01:14:02.065280 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 28 01:14:02.065371 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 28 01:14:02.065461 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 28 01:14:02.065549 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Jan 28 01:14:02.065636 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Jan 28 01:14:02.065722 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window]
Jan 28 01:14:02.065809 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 28 01:14:02.065927 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jan 28 01:14:02.066088 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 28 01:14:02.066192 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Jan 28 01:14:02.066294 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref]
Jan 28 01:14:02.066390 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff]
Jan 28 01:14:02.066486 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Jan 28 01:14:02.066586 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 28 01:14:02.066693 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.066791 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff]
Jan 28 01:14:02.066888 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 28 01:14:02.066985 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff]
Jan 28 01:14:02.067102 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff]
Jan 28 01:14:02.067200 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Jan 28 01:14:02.067313 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.067411 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff]
Jan 28 01:14:02.067509 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 28 01:14:02.067606 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff]
Jan 28 01:14:02.067719 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Jan 28 01:14:02.067828 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.067928 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff]
Jan 28 01:14:02.068036 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 28 01:14:02.068133 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff]
Jan 28 01:14:02.068228 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Jan 28 01:14:02.068337 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.068436 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff]
Jan 28 01:14:02.068535 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 28 01:14:02.068633 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff]
Jan 28 01:14:02.068730 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref]
Jan 28 01:14:02.068840 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.068939 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff]
Jan 28 01:14:02.069055 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 28 01:14:02.069155 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff]
Jan 28 01:14:02.069251 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref]
Jan 28 01:14:02.069358 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.069454 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff]
Jan 28 01:14:02.069553 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 28 01:14:02.069648 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff]
Jan 28 01:14:02.069742 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref]
Jan 28 01:14:02.069846 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.069943 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff]
Jan 28 01:14:02.070064 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 28 01:14:02.070163 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff]
Jan 28 01:14:02.070258 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref]
Jan 28 01:14:02.070360 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.070457 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff]
Jan 28 01:14:02.070555 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 28 01:14:02.070652 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff]
Jan 28 01:14:02.070748 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref]
Jan 28 01:14:02.070851 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.070948 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff]
Jan 28 01:14:02.071058 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 28 01:14:02.071161 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff]
Jan 28 01:14:02.071257 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref]
Jan 28 01:14:02.071368 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.071479 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff]
Jan 28 01:14:02.071578 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 28 01:14:02.071686 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff]
Jan 28 01:14:02.071786 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref]
Jan 28 01:14:02.071897 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.071995 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff]
Jan 28 01:14:02.072105 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 28 01:14:02.072201 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff]
Jan 28 01:14:02.072297 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref]
Jan 28 01:14:02.072403 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.072499 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff]
Jan 28 01:14:02.072597 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 28 01:14:02.072693 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff]
Jan 28 01:14:02.072793 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref]
Jan 28 01:14:02.072896 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.072993 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff]
Jan 28 01:14:02.073097 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 28 01:14:02.073192 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff]
Jan 28 01:14:02.073287 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref]
Jan 28 01:14:02.073397 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.073494 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff]
Jan 28 01:14:02.073590 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 28 01:14:02.073685 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff]
Jan 28 01:14:02.073779 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref]
Jan 28 01:14:02.073887 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.073986 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff]
Jan 28 01:14:02.075671 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 28 01:14:02.075777 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff]
Jan 28 01:14:02.075873 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref]
Jan 28 01:14:02.075979 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.076086 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff]
Jan 28 01:14:02.077250 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 28 01:14:02.077372 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff]
Jan 28 01:14:02.077472 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref]
Jan 28 01:14:02.077580 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.077678 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff]
Jan 28 01:14:02.077781 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 28 01:14:02.077879 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff]
Jan 28 01:14:02.077976 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref]
Jan 28 01:14:02.079152 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.079263 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff]
Jan 28 01:14:02.079361 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Jan 28 01:14:02.079462 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff]
Jan 28 01:14:02.079563 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref]
Jan 28 01:14:02.079684 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 28 01:14:02.079781 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff]
Jan 28 01:14:02.079877 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Jan 28 01:14:02.079972 kernel: pci 0000:00:04.2: bridge window [mem 
0x81c00000-0x81dfffff] Jan 28 01:14:02.080092 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 28 01:14:02.080197 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.080295 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Jan 28 01:14:02.080390 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 28 01:14:02.080486 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 28 01:14:02.080581 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 28 01:14:02.080691 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.080789 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Jan 28 01:14:02.080884 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 28 01:14:02.080979 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 28 01:14:02.082137 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 28 01:14:02.082252 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.082354 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Jan 28 01:14:02.082450 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 28 01:14:02.082545 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 28 01:14:02.082641 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 28 01:14:02.082748 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.082848 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Jan 28 01:14:02.082944 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 28 01:14:02.083094 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 28 01:14:02.083193 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 28 01:14:02.083296 kernel: pci 
0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.083398 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff] Jan 28 01:14:02.083495 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 28 01:14:02.083591 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 28 01:14:02.083698 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 28 01:14:02.083808 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.083908 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Jan 28 01:14:02.085026 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 28 01:14:02.085163 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 28 01:14:02.085265 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 28 01:14:02.085374 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.085471 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Jan 28 01:14:02.085573 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 28 01:14:02.085668 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 28 01:14:02.085764 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 28 01:14:02.085867 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.085965 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Jan 28 01:14:02.086080 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 28 01:14:02.086179 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 28 01:14:02.086274 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 28 01:14:02.086382 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.086478 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Jan 28 01:14:02.086574 kernel: 
pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 28 01:14:02.086670 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 28 01:14:02.086771 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 28 01:14:02.086875 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 01:14:02.086972 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Jan 28 01:14:02.087463 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 28 01:14:02.087563 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 28 01:14:02.087672 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 28 01:14:02.087782 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 28 01:14:02.087878 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 28 01:14:02.087987 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jan 28 01:14:02.088100 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Jan 28 01:14:02.088196 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Jan 28 01:14:02.088298 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 28 01:14:02.088399 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Jan 28 01:14:02.088511 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Jan 28 01:14:02.088611 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Jan 28 01:14:02.088712 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 28 01:14:02.088810 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 28 01:14:02.088913 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 28 01:14:02.089028 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 28 01:14:02.089132 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 28 
01:14:02.089241 kernel: pci_bus 0000:02: extended config space not accessible Jan 28 01:14:02.089255 kernel: acpiphp: Slot [1] registered Jan 28 01:14:02.089264 kernel: acpiphp: Slot [0] registered Jan 28 01:14:02.089276 kernel: acpiphp: Slot [2] registered Jan 28 01:14:02.089284 kernel: acpiphp: Slot [3] registered Jan 28 01:14:02.089293 kernel: acpiphp: Slot [4] registered Jan 28 01:14:02.089301 kernel: acpiphp: Slot [5] registered Jan 28 01:14:02.089309 kernel: acpiphp: Slot [6] registered Jan 28 01:14:02.089318 kernel: acpiphp: Slot [7] registered Jan 28 01:14:02.089326 kernel: acpiphp: Slot [8] registered Jan 28 01:14:02.089334 kernel: acpiphp: Slot [9] registered Jan 28 01:14:02.089346 kernel: acpiphp: Slot [10] registered Jan 28 01:14:02.089354 kernel: acpiphp: Slot [11] registered Jan 28 01:14:02.089363 kernel: acpiphp: Slot [12] registered Jan 28 01:14:02.089372 kernel: acpiphp: Slot [13] registered Jan 28 01:14:02.089380 kernel: acpiphp: Slot [14] registered Jan 28 01:14:02.089389 kernel: acpiphp: Slot [15] registered Jan 28 01:14:02.089397 kernel: acpiphp: Slot [16] registered Jan 28 01:14:02.089407 kernel: acpiphp: Slot [17] registered Jan 28 01:14:02.089416 kernel: acpiphp: Slot [18] registered Jan 28 01:14:02.089424 kernel: acpiphp: Slot [19] registered Jan 28 01:14:02.089433 kernel: acpiphp: Slot [20] registered Jan 28 01:14:02.089441 kernel: acpiphp: Slot [21] registered Jan 28 01:14:02.089450 kernel: acpiphp: Slot [22] registered Jan 28 01:14:02.089458 kernel: acpiphp: Slot [23] registered Jan 28 01:14:02.089467 kernel: acpiphp: Slot [24] registered Jan 28 01:14:02.089477 kernel: acpiphp: Slot [25] registered Jan 28 01:14:02.089485 kernel: acpiphp: Slot [26] registered Jan 28 01:14:02.089493 kernel: acpiphp: Slot [27] registered Jan 28 01:14:02.089502 kernel: acpiphp: Slot [28] registered Jan 28 01:14:02.089511 kernel: acpiphp: Slot [29] registered Jan 28 01:14:02.089519 kernel: acpiphp: Slot [30] registered Jan 28 01:14:02.089527 kernel: acpiphp: 
Slot [31] registered Jan 28 01:14:02.090525 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Jan 28 01:14:02.090657 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Jan 28 01:14:02.090766 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 28 01:14:02.090777 kernel: acpiphp: Slot [0-2] registered Jan 28 01:14:02.090890 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 28 01:14:02.090994 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Jan 28 01:14:02.091130 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Jan 28 01:14:02.091231 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 28 01:14:02.091335 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 28 01:14:02.091347 kernel: acpiphp: Slot [0-3] registered Jan 28 01:14:02.091457 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 28 01:14:02.091560 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Jan 28 01:14:02.091676 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Jan 28 01:14:02.091777 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 28 01:14:02.091790 kernel: acpiphp: Slot [0-4] registered Jan 28 01:14:02.091903 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 28 01:14:02.092034 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Jan 28 01:14:02.092141 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 28 01:14:02.092158 kernel: acpiphp: Slot [0-5] registered Jan 28 01:14:02.092271 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 28 01:14:02.092376 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Jan 28 01:14:02.092476 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Jan 28 01:14:02.092577 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 28 01:14:02.092594 kernel: acpiphp: Slot [0-6] 
registered Jan 28 01:14:02.092698 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 28 01:14:02.092709 kernel: acpiphp: Slot [0-7] registered Jan 28 01:14:02.092811 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 28 01:14:02.092822 kernel: acpiphp: Slot [0-8] registered Jan 28 01:14:02.092922 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 28 01:14:02.092933 kernel: acpiphp: Slot [0-9] registered Jan 28 01:14:02.093054 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 28 01:14:02.093067 kernel: acpiphp: Slot [0-10] registered Jan 28 01:14:02.093466 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 28 01:14:02.093481 kernel: acpiphp: Slot [0-11] registered Jan 28 01:14:02.093643 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 28 01:14:02.093657 kernel: acpiphp: Slot [0-12] registered Jan 28 01:14:02.093761 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 28 01:14:02.093777 kernel: acpiphp: Slot [0-13] registered Jan 28 01:14:02.093877 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 28 01:14:02.093889 kernel: acpiphp: Slot [0-14] registered Jan 28 01:14:02.093988 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 28 01:14:02.093999 kernel: acpiphp: Slot [0-15] registered Jan 28 01:14:02.094798 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 28 01:14:02.094817 kernel: acpiphp: Slot [0-16] registered Jan 28 01:14:02.094926 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 28 01:14:02.094947 kernel: acpiphp: Slot [0-17] registered Jan 28 01:14:02.095079 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 28 01:14:02.095092 kernel: acpiphp: Slot [0-18] registered Jan 28 01:14:02.095193 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 28 01:14:02.095208 kernel: acpiphp: Slot [0-19] registered Jan 28 01:14:02.095307 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 28 01:14:02.095319 kernel: acpiphp: Slot [0-20] registered Jan 28 01:14:02.095418 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 28 01:14:02.095429 
kernel: acpiphp: Slot [0-21] registered Jan 28 01:14:02.095526 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 28 01:14:02.095538 kernel: acpiphp: Slot [0-22] registered Jan 28 01:14:02.095639 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 28 01:14:02.095661 kernel: acpiphp: Slot [0-23] registered Jan 28 01:14:02.095762 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 28 01:14:02.095773 kernel: acpiphp: Slot [0-24] registered Jan 28 01:14:02.095873 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 28 01:14:02.095885 kernel: acpiphp: Slot [0-25] registered Jan 28 01:14:02.095988 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 28 01:14:02.096000 kernel: acpiphp: Slot [0-26] registered Jan 28 01:14:02.096106 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 28 01:14:02.096118 kernel: acpiphp: Slot [0-27] registered Jan 28 01:14:02.096214 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 28 01:14:02.096226 kernel: acpiphp: Slot [0-28] registered Jan 28 01:14:02.096326 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 28 01:14:02.096340 kernel: acpiphp: Slot [0-29] registered Jan 28 01:14:02.096439 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 28 01:14:02.096451 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 28 01:14:02.096460 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 28 01:14:02.096469 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 28 01:14:02.096477 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 28 01:14:02.096488 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jan 28 01:14:02.096497 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 28 01:14:02.096506 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 28 01:14:02.096515 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 28 01:14:02.096523 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 28 01:14:02.096532 kernel: ACPI: PCI: Interrupt 
link GSIB configured for IRQ 17 Jan 28 01:14:02.096540 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 28 01:14:02.096551 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 28 01:14:02.096559 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 28 01:14:02.096568 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 28 01:14:02.096576 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 28 01:14:02.096585 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 28 01:14:02.096594 kernel: iommu: Default domain type: Translated Jan 28 01:14:02.096602 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 28 01:14:02.096613 kernel: efivars: Registered efivars operations Jan 28 01:14:02.096621 kernel: PCI: Using ACPI for IRQ routing Jan 28 01:14:02.096630 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 28 01:14:02.096639 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Jan 28 01:14:02.096647 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Jan 28 01:14:02.096655 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Jan 28 01:14:02.096664 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Jan 28 01:14:02.096672 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Jan 28 01:14:02.096682 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Jan 28 01:14:02.096691 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 28 01:14:02.096699 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] Jan 28 01:14:02.096708 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Jan 28 01:14:02.096810 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 28 01:14:02.096907 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 28 01:14:02.099162 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 28 01:14:02.099185 kernel: vgaarb: loaded Jan 28 01:14:02.099197 
kernel: clocksource: Switched to clocksource kvm-clock Jan 28 01:14:02.099208 kernel: VFS: Disk quotas dquot_6.6.0 Jan 28 01:14:02.099218 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 28 01:14:02.099227 kernel: pnp: PnP ACPI init Jan 28 01:14:02.099355 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 28 01:14:02.099374 kernel: pnp: PnP ACPI: found 5 devices Jan 28 01:14:02.099383 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 28 01:14:02.099392 kernel: NET: Registered PF_INET protocol family Jan 28 01:14:02.099401 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 28 01:14:02.099410 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 28 01:14:02.099419 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 28 01:14:02.099428 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 28 01:14:02.099439 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 28 01:14:02.099448 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 28 01:14:02.099457 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 28 01:14:02.099466 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 28 01:14:02.099474 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 28 01:14:02.099483 kernel: NET: Registered PF_XDP protocol family Jan 28 01:14:02.099599 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 28 01:14:02.099719 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 28 01:14:02.099828 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 28 01:14:02.099934 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] 
add_size 1000 Jan 28 01:14:02.100053 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 28 01:14:02.100159 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 28 01:14:02.100266 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 28 01:14:02.100374 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 28 01:14:02.100479 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 28 01:14:02.100583 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 28 01:14:02.100687 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 28 01:14:02.100791 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 28 01:14:02.100892 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 28 01:14:02.100998 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 28 01:14:02.103260 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 28 01:14:02.103372 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 28 01:14:02.103478 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 28 01:14:02.103581 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 28 01:14:02.103696 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 28 01:14:02.103801 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 28 01:14:02.103910 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 28 01:14:02.104025 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 28 01:14:02.104127 kernel: pci 
0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 28 01:14:02.104228 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 28 01:14:02.104332 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 28 01:14:02.104434 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 28 01:14:02.104540 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 28 01:14:02.104643 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 28 01:14:02.104997 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 28 01:14:02.105130 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Jan 28 01:14:02.105234 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Jan 28 01:14:02.105339 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Jan 28 01:14:02.105447 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Jan 28 01:14:02.105551 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Jan 28 01:14:02.105654 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Jan 28 01:14:02.105755 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Jan 28 01:14:02.105856 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Jan 28 01:14:02.105957 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Jan 28 01:14:02.106072 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Jan 28 01:14:02.106178 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Jan 28 01:14:02.106279 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Jan 28 01:14:02.106380 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Jan 28 01:14:02.106481 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: 
can't assign; no space Jan 28 01:14:02.106578 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.106677 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.106776 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.106874 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.106970 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.107075 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.107171 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.107269 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.107366 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.107468 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.107564 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.107681 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.107780 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.107879 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.107984 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.108102 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.108212 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.108314 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.108411 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.108513 kernel: pci 0000:00:05.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.108612 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.108715 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.108812 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.108915 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.109030 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.109132 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.109252 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.109355 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.109458 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.109556 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Jan 28 01:14:02.109652 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Jan 28 01:14:02.109751 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Jan 28 01:14:02.109867 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Jan 28 01:14:02.109966 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Jan 28 01:14:02.110084 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Jan 28 01:14:02.110181 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Jan 28 01:14:02.110278 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Jan 28 01:14:02.110373 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Jan 28 01:14:02.110468 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Jan 28 01:14:02.110564 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Jan 28 01:14:02.110659 kernel: 
pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Jan 28 01:14:02.110760 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Jan 28 01:14:02.110855 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.110950 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.111060 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.111157 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.111255 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.111353 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.111455 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.111553 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.111660 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.111761 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.111859 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.111958 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.112077 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.112176 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.112276 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.112413 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.112543 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.112643 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.112748 
kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.112845 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.112945 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.113052 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.113149 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.113246 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.113344 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.113442 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.113542 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.113640 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.113739 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 01:14:02.113835 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 28 01:14:02.113943 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 28 01:14:02.114065 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 28 01:14:02.114164 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 28 01:14:02.114267 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 28 01:14:02.114365 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 28 01:14:02.114462 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 28 01:14:02.114560 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 28 01:14:02.114658 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 28 01:14:02.114765 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned 
Jan 28 01:14:02.114867 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 28 01:14:02.114963 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 28 01:14:02.115070 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 28 01:14:02.115170 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 28 01:14:02.115273 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 28 01:14:02.115372 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 28 01:14:02.115470 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 28 01:14:02.115574 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 28 01:14:02.115686 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 28 01:14:02.115787 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 28 01:14:02.115890 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 28 01:14:02.115991 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 28 01:14:02.116113 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 28 01:14:02.116221 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 28 01:14:02.116325 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 28 01:14:02.116433 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 28 01:14:02.116529 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 28 01:14:02.116627 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 28 01:14:02.116722 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 28 01:14:02.116819 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 28 01:14:02.116919 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 28 01:14:02.117023 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 28 
01:14:02.117120 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 28 01:14:02.117220 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 28 01:14:02.117320 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 28 01:14:02.117418 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 28 01:14:02.117518 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 28 01:14:02.117619 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 28 01:14:02.117716 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 28 01:14:02.117811 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 28 01:14:02.117907 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 28 01:14:02.118010 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 28 01:14:02.118109 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 28 01:14:02.118207 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 28 01:14:02.118303 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 28 01:14:02.118401 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 28 01:14:02.118497 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 28 01:14:02.118592 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 28 01:14:02.118686 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 28 01:14:02.118784 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 28 01:14:02.118879 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 28 01:14:02.118974 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 28 01:14:02.119080 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 28 01:14:02.119196 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 
28 01:14:02.119298 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 28 01:14:02.119396 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 28 01:14:02.119492 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 28 01:14:02.119588 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 28 01:14:02.119694 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 28 01:14:02.119791 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 28 01:14:02.119890 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 28 01:14:02.119987 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 28 01:14:02.120094 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 28 01:14:02.120192 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 28 01:14:02.120292 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 28 01:14:02.120389 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 28 01:14:02.120487 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 28 01:14:02.120587 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 28 01:14:02.120683 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 28 01:14:02.120780 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 28 01:14:02.120876 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 28 01:14:02.120980 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 28 01:14:02.121093 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 28 01:14:02.121190 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 28 01:14:02.121287 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 28 01:14:02.121388 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 28 01:14:02.121492 kernel: pci 
0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 28 01:14:02.121591 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 28 01:14:02.121688 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 28 01:14:02.121787 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 28 01:14:02.121884 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 28 01:14:02.121983 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 28 01:14:02.122093 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 28 01:14:02.122199 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 28 01:14:02.122296 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 28 01:14:02.122394 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 28 01:14:02.122511 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 28 01:14:02.122615 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 28 01:14:02.122713 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 28 01:14:02.122812 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 28 01:14:02.122908 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 28 01:14:02.123019 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 28 01:14:02.123120 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 28 01:14:02.123215 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 28 01:14:02.123311 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 28 01:14:02.123411 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 28 01:14:02.123505 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 28 01:14:02.123599 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 28 01:14:02.123703 kernel: pci 0000:00:05.2: bridge window [mem 
0x38d000000000-0x38d7ffffffff 64bit pref] Jan 28 01:14:02.123801 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 28 01:14:02.123900 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 28 01:14:02.124001 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 28 01:14:02.124137 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 28 01:14:02.124237 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 28 01:14:02.124334 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 28 01:14:02.124429 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 28 01:14:02.124525 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 28 01:14:02.124630 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 28 01:14:02.124723 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 28 01:14:02.124813 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 28 01:14:02.124901 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 28 01:14:02.124988 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 28 01:14:02.125083 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 28 01:14:02.125191 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 28 01:14:02.125283 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 28 01:14:02.125373 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 28 01:14:02.125472 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 28 01:14:02.125567 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 28 01:14:02.125660 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 28 01:14:02.125762 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 28 01:14:02.125872 kernel: pci_bus 0000:03: resource 2 [mem 
0x380800000000-0x380fffffffff 64bit pref] Jan 28 01:14:02.125976 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 28 01:14:02.126082 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 28 01:14:02.126182 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 28 01:14:02.126278 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 28 01:14:02.126376 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 28 01:14:02.126467 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 28 01:14:02.126565 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 28 01:14:02.126655 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 28 01:14:02.126757 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 28 01:14:02.126847 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 28 01:14:02.126946 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 28 01:14:02.127062 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 28 01:14:02.127161 kernel: pci_bus 0000:0a: resource 1 [mem 0x83000000-0x831fffff] Jan 28 01:14:02.127256 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 28 01:14:02.128462 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 28 01:14:02.128598 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 28 01:14:02.128705 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 28 01:14:02.128798 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 28 01:14:02.128909 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 28 01:14:02.129000 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 28 01:14:02.129111 
kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 28 01:14:02.129202 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 28 01:14:02.129302 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 28 01:14:02.129399 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 28 01:14:02.129502 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 28 01:14:02.129594 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 28 01:14:02.129699 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 28 01:14:02.129789 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 28 01:14:02.129889 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 28 01:14:02.129980 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 28 01:14:02.130079 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 28 01:14:02.130178 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 28 01:14:02.130269 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 28 01:14:02.130359 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 28 01:14:02.130459 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 28 01:14:02.130553 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 28 01:14:02.130643 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 28 01:14:02.130743 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 28 01:14:02.130833 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 28 01:14:02.130926 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 28 01:14:02.132113 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 28 01:14:02.132226 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] 
Jan 28 01:14:02.132318 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 28 01:14:02.132419 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 28 01:14:02.132509 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 28 01:14:02.132612 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 28 01:14:02.132713 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 28 01:14:02.132803 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 28 01:14:02.132892 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 28 01:14:02.132990 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 28 01:14:02.133879 kernel: pci_bus 0000:19: resource 1 [mem 0x81200000-0x813fffff] Jan 28 01:14:02.133977 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 28 01:14:02.134093 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 28 01:14:02.134185 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 28 01:14:02.134275 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 28 01:14:02.134374 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 28 01:14:02.134467 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 28 01:14:02.134557 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 28 01:14:02.134655 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 28 01:14:02.134747 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 28 01:14:02.134837 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 28 01:14:02.134937 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 28 01:14:02.136068 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 28 01:14:02.136171 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit 
pref] Jan 28 01:14:02.136275 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 28 01:14:02.136366 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 28 01:14:02.136454 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 28 01:14:02.136467 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 28 01:14:02.136480 kernel: PCI: CLS 0 bytes, default 64 Jan 28 01:14:02.136489 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 28 01:14:02.136498 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 28 01:14:02.136506 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 28 01:14:02.136515 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 28 01:14:02.136523 kernel: Initialise system trusted keyrings Jan 28 01:14:02.136533 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 28 01:14:02.136544 kernel: Key type asymmetric registered Jan 28 01:14:02.136552 kernel: Asymmetric key parser 'x509' registered Jan 28 01:14:02.136561 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 28 01:14:02.136569 kernel: io scheduler mq-deadline registered Jan 28 01:14:02.136578 kernel: io scheduler kyber registered Jan 28 01:14:02.136586 kernel: io scheduler bfq registered Jan 28 01:14:02.136697 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 28 01:14:02.136803 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 28 01:14:02.136907 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 28 01:14:02.137027 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 28 01:14:02.137132 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 28 01:14:02.137230 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 28 01:14:02.137334 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 28 01:14:02.137432 
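The "software IO TLB" line above reports both a mapped range and a size; the two agree, and 64 MiB is the kernel's default SWIOTLB bounce-buffer allocation when no `swiotlb=` option is given on the command line (the command line in this boot has none). A quick consistency check on the log's numbers:

```python
# Range copied from the "software IO TLB: mapped" line in this log.
start, end = 0x77ede000, 0x7bede000
size = end - start
assert size == 64 * 1024 * 1024  # matches the reported "(64MB)"
```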
kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 28 01:14:02.137531 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 28 01:14:02.137628 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 28 01:14:02.137728 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 28 01:14:02.137827 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 28 01:14:02.137929 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 28 01:14:02.138538 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 28 01:14:02.138657 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 28 01:14:02.138759 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 28 01:14:02.138775 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 28 01:14:02.138876 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 28 01:14:02.138976 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 28 01:14:02.139270 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 28 01:14:02.139373 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 28 01:14:02.139480 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 28 01:14:02.139578 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 28 01:14:02.139696 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 28 01:14:02.139793 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 28 01:14:02.139892 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 28 01:14:02.139993 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 28 01:14:02.140107 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 28 01:14:02.140205 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 28 01:14:02.140306 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 28 01:14:02.140403 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 28 01:14:02.140506 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 28 01:14:02.140603 kernel: 
pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 28 01:14:02.140614 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 28 01:14:02.140712 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 28 01:14:02.140809 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 28 01:14:02.140910 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 28 01:14:02.141124 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 28 01:14:02.141235 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 28 01:14:02.141332 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 28 01:14:02.141433 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 28 01:14:02.141530 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 28 01:14:02.141633 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 28 01:14:02.141731 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 28 01:14:02.141838 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 28 01:14:02.141935 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 28 01:14:02.142045 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 28 01:14:02.142146 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 28 01:14:02.142246 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 28 01:14:02.142342 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 28 01:14:02.142353 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 28 01:14:02.142456 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 28 01:14:02.142554 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 28 01:14:02.142656 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 28 01:14:02.142755 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 28 01:14:02.142856 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 28 01:14:02.142954 kernel: pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 28 01:14:02.143075 kernel: pcieport 0000:00:05.3: 
PME: Signaling with IRQ 51 Jan 28 01:14:02.143175 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 28 01:14:02.143276 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 28 01:14:02.143375 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 28 01:14:02.143386 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 28 01:14:02.143395 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 28 01:14:02.143405 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 28 01:14:02.143416 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 28 01:14:02.143425 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 28 01:14:02.143433 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 28 01:14:02.143442 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 28 01:14:02.143555 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 28 01:14:02.143661 kernel: rtc_cmos 00:03: registered as rtc0 Jan 28 01:14:02.143756 kernel: rtc_cmos 00:03: setting system clock to 2026-01-28T01:14:00 UTC (1769562840) Jan 28 01:14:02.143852 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 28 01:14:02.143862 kernel: intel_pstate: CPU model not supported Jan 28 01:14:02.143871 kernel: efifb: probing for efifb Jan 28 01:14:02.143880 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 28 01:14:02.143889 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 28 01:14:02.143899 kernel: efifb: scrolling: redraw Jan 28 01:14:02.143910 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 28 01:14:02.143920 kernel: Console: switching to colour frame buffer device 160x50 Jan 28 01:14:02.143928 kernel: fb0: EFI VGA frame buffer device Jan 28 01:14:02.143937 kernel: pstore: Using crash dump compression: deflate Jan 28 01:14:02.143945 kernel: pstore: Registered efi_pstore as persistent store backend Jan 28 
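The rtc_cmos line above prints the wall-clock time it sets alongside the equivalent Unix epoch value; the two are consistent, which can be confirmed with a one-liner (timestamp and epoch value taken verbatim from the log):

```python
from datetime import datetime, timezone

# "setting system clock to 2026-01-28T01:14:00 UTC (1769562840)"
stamp = datetime(2026, 1, 28, 1, 14, 0, tzinfo=timezone.utc)
assert int(stamp.timestamp()) == 1769562840
```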
01:14:02.143954 kernel: NET: Registered PF_INET6 protocol family Jan 28 01:14:02.143963 kernel: Segment Routing with IPv6 Jan 28 01:14:02.143971 kernel: In-situ OAM (IOAM) with IPv6 Jan 28 01:14:02.143982 kernel: NET: Registered PF_PACKET protocol family Jan 28 01:14:02.143990 kernel: Key type dns_resolver registered Jan 28 01:14:02.144000 kernel: IPI shorthand broadcast: enabled Jan 28 01:14:02.144023 kernel: sched_clock: Marking stable (2415001996, 156692192)->(2904717208, -333023020) Jan 28 01:14:02.144038 kernel: registered taskstats version 1 Jan 28 01:14:02.144047 kernel: Loading compiled-in X.509 certificates Jan 28 01:14:02.144055 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 0eb3c2aae9988d4ab7f0e142c4f5c61453c9ddb3' Jan 28 01:14:02.144066 kernel: Demotion targets for Node 0: null Jan 28 01:14:02.144075 kernel: Key type .fscrypt registered Jan 28 01:14:02.144084 kernel: Key type fscrypt-provisioning registered Jan 28 01:14:02.144093 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 28 01:14:02.144102 kernel: ima: Allocated hash algorithm: sha1 Jan 28 01:14:02.144110 kernel: ima: No architecture policies found Jan 28 01:14:02.144119 kernel: clk: Disabling unused clocks Jan 28 01:14:02.144130 kernel: Freeing unused kernel image (initmem) memory: 15536K Jan 28 01:14:02.144139 kernel: Write protecting the kernel read-only data: 47104k Jan 28 01:14:02.144148 kernel: Freeing unused kernel image (rodata/data gap) memory: 1124K Jan 28 01:14:02.144157 kernel: Run /init as init process Jan 28 01:14:02.144166 kernel: with arguments: Jan 28 01:14:02.144175 kernel: /init Jan 28 01:14:02.144183 kernel: with environment: Jan 28 01:14:02.144191 kernel: HOME=/ Jan 28 01:14:02.144202 kernel: TERM=linux Jan 28 01:14:02.144211 kernel: SCSI subsystem initialized Jan 28 01:14:02.144220 kernel: libata version 3.00 loaded. 
Jan 28 01:14:02.144334 kernel: ahci 0000:00:1f.2: version 3.0 Jan 28 01:14:02.144346 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 28 01:14:02.144444 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 28 01:14:02.144546 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 28 01:14:02.144645 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 28 01:14:02.144767 kernel: scsi host0: ahci Jan 28 01:14:02.144881 kernel: scsi host1: ahci Jan 28 01:14:02.145018 kernel: scsi host2: ahci Jan 28 01:14:02.145227 kernel: scsi host3: ahci Jan 28 01:14:02.145337 kernel: scsi host4: ahci Jan 28 01:14:02.145442 kernel: scsi host5: ahci Jan 28 01:14:02.145454 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 55 lpm-pol 1 Jan 28 01:14:02.145463 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 55 lpm-pol 1 Jan 28 01:14:02.145472 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 55 lpm-pol 1 Jan 28 01:14:02.145481 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 55 lpm-pol 1 Jan 28 01:14:02.145493 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 55 lpm-pol 1 Jan 28 01:14:02.145501 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 55 lpm-pol 1 Jan 28 01:14:02.145510 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 28 01:14:02.145519 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 28 01:14:02.145528 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 28 01:14:02.145536 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 28 01:14:02.145545 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 28 01:14:02.145556 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 28 01:14:02.145565 kernel: ACPI: bus type USB registered Jan 28 01:14:02.145574 kernel: usbcore: registered new interface driver usbfs Jan 28 01:14:02.145583 kernel: usbcore: registered 
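The AHCI probe above decodes its port mask into a port count; the popcount of 0x3f is indeed 6, matching "6/6 ports implemented". All six links then report "SATA link down", which is expected on this VM: no disks are attached behind the AHCI controller, and the actual boot disk turns out to be the virtio-blk device (vda) further down. A small check of the mask arithmetic:

```python
port_mask = 0x3f  # from "6/6 ports implemented (port mask 0x3f)"
implemented = bin(port_mask).count("1")
assert implemented == 6
# Six set bits -> ports 0-5, i.e. ata1 through ata6 in the link-down lines.
```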
new interface driver hub Jan 28 01:14:02.145592 kernel: usbcore: registered new device driver usb Jan 28 01:14:02.145704 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 28 01:14:02.145809 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 28 01:14:02.145919 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 28 01:14:02.146034 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 28 01:14:02.146750 kernel: hub 1-0:1.0: USB hub found Jan 28 01:14:02.146865 kernel: hub 1-0:1.0: 2 ports detected Jan 28 01:14:02.146980 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 28 01:14:02.147131 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 28 01:14:02.147144 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 28 01:14:02.147153 kernel: GPT:25804799 != 104857599 Jan 28 01:14:02.147162 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 28 01:14:02.147171 kernel: GPT:25804799 != 104857599 Jan 28 01:14:02.147180 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 28 01:14:02.147189 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 28 01:14:02.147202 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
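The GPT warnings above are the classic cloud-image pattern rather than corruption: the backup GPT header sits at the last LBA of the original disk image (25804799) while the virtual disk actually ends at LBA 104857599, i.e. the image was written for a roughly 12.3 GiB disk that was then grown to 50 GiB. The mismatch is normally repaired on first boot by relocating the backup header to the real end of the disk (or manually, as the log suggests, with GNU Parted). The arithmetic, with all values taken from the log:

```python
SECTOR = 512
# "GPT:25804799 != 104857599" from the warnings above.
image_last_lba, disk_last_lba = 25804799, 104857599

image_gib = (image_last_lba + 1) * SECTOR / 2**30
disk_gib = (disk_last_lba + 1) * SECTOR / 2**30
assert disk_gib == 50.0              # matches "50.0 GiB" on the virtio_blk line
assert round(image_gib, 1) == 12.3   # size the image was originally built for
```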
Jan 28 01:14:02.147211 kernel: device-mapper: uevent: version 1.0.3 Jan 28 01:14:02.147220 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 28 01:14:02.147229 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 28 01:14:02.147238 kernel: raid6: avx512x4 gen() 42845 MB/s Jan 28 01:14:02.147247 kernel: raid6: avx512x2 gen() 45629 MB/s Jan 28 01:14:02.147255 kernel: raid6: avx512x1 gen() 44390 MB/s Jan 28 01:14:02.147266 kernel: raid6: avx2x4 gen() 34226 MB/s Jan 28 01:14:02.147275 kernel: raid6: avx2x2 gen() 33604 MB/s Jan 28 01:14:02.147284 kernel: raid6: avx2x1 gen() 30493 MB/s Jan 28 01:14:02.147293 kernel: raid6: using algorithm avx512x2 gen() 45629 MB/s Jan 28 01:14:02.147302 kernel: raid6: .... xor() 26721 MB/s, rmw enabled Jan 28 01:14:02.147313 kernel: raid6: using avx512x2 recovery algorithm Jan 28 01:14:02.147324 kernel: xor: automatically using best checksumming function avx Jan 28 01:14:02.147454 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 28 01:14:02.147467 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 28 01:14:02.147476 kernel: BTRFS: device fsid 0f5fa021-4357-40bb-b32a-e1579c5824ad devid 1 transid 39 /dev/mapper/usr (253:0) scanned by mount (203) Jan 28 01:14:02.147485 kernel: BTRFS info (device dm-0): first mount of filesystem 0f5fa021-4357-40bb-b32a-e1579c5824ad Jan 28 01:14:02.147494 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 28 01:14:02.147507 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 28 01:14:02.147516 kernel: BTRFS info (device dm-0): enabling free space tree Jan 28 01:14:02.147525 kernel: loop: module loaded Jan 28 01:14:02.147534 kernel: loop0: detected capacity change from 0 to 100552 Jan 28 01:14:02.147543 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 28 01:14:02.147553 systemd[1]: Successfully made /usr/ read-only. 
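The raid6 lines above are the kernel benchmarking each available SIMD implementation of the parity-generation routine at boot and keeping the fastest; here avx512x2 wins despite avx512x4 using wider batching. The selection logic amounts to a max over the measured throughputs (all figures copied from the log):

```python
# gen() throughputs in MB/s as reported by the raid6 benchmark in this log.
gen_speeds = {
    "avx512x4": 42845, "avx512x2": 45629, "avx512x1": 44390,
    "avx2x4": 34226, "avx2x2": 33604, "avx2x1": 30493,
}
best = max(gen_speeds, key=gen_speeds.get)
assert best == "avx512x2"  # matches "using algorithm avx512x2 gen() 45629 MB/s"
```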
Jan 28 01:14:02.147566 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 28 01:14:02.147578 systemd[1]: Detected virtualization kvm. Jan 28 01:14:02.147587 systemd[1]: Detected architecture x86-64. Jan 28 01:14:02.147596 systemd[1]: Running in initrd. Jan 28 01:14:02.147605 systemd[1]: No hostname configured, using default hostname. Jan 28 01:14:02.147615 systemd[1]: Hostname set to . Jan 28 01:14:02.147624 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 28 01:14:02.147634 systemd[1]: Queued start job for default target initrd.target. Jan 28 01:14:02.147643 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 28 01:14:02.147662 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 28 01:14:02.147671 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 28 01:14:02.147681 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 28 01:14:02.147691 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 28 01:14:02.147702 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 28 01:14:02.147712 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 28 01:14:02.147721 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 28 01:14:02.147730 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
Jan 28 01:14:02.147739 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 28 01:14:02.147749 systemd[1]: Reached target paths.target - Path Units. Jan 28 01:14:02.147761 systemd[1]: Reached target slices.target - Slice Units. Jan 28 01:14:02.147770 systemd[1]: Reached target swap.target - Swaps. Jan 28 01:14:02.147780 systemd[1]: Reached target timers.target - Timer Units. Jan 28 01:14:02.147789 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 28 01:14:02.147798 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 28 01:14:02.147807 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 28 01:14:02.147816 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 28 01:14:02.147827 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 28 01:14:02.147837 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 28 01:14:02.147846 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 28 01:14:02.147855 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 28 01:14:02.147865 systemd[1]: Reached target sockets.target - Socket Units. Jan 28 01:14:02.147874 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 28 01:14:02.147884 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 28 01:14:02.147896 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 28 01:14:02.147905 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 28 01:14:02.147914 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). 
Jan 28 01:14:02.147924 systemd[1]: Starting systemd-fsck-usr.service... Jan 28 01:14:02.147933 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 28 01:14:02.147942 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 28 01:14:02.147953 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:14:02.147963 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 28 01:14:02.147972 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 28 01:14:02.147982 systemd[1]: Finished systemd-fsck-usr.service. Jan 28 01:14:02.148027 systemd-journald[340]: Collecting audit messages is enabled. Jan 28 01:14:02.148051 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 28 01:14:02.148061 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 28 01:14:02.148072 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:14:02.148082 kernel: Bridge firewalling registered Jan 28 01:14:02.148091 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 28 01:14:02.148101 kernel: audit: type=1130 audit(1769562842.113:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.148111 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 28 01:14:02.148120 kernel: audit: type=1130 audit(1769562842.120:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:02.148131 kernel: audit: type=1130 audit(1769562842.126:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.148141 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 28 01:14:02.148150 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 28 01:14:02.148159 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 28 01:14:02.148171 systemd-journald[340]: Journal started Jan 28 01:14:02.148193 systemd-journald[340]: Runtime Journal (/run/log/journal/f1bbc2cc1d7f4404a4a0f9def3d0e3be) is 8M, max 77.9M, 69.9M free. Jan 28 01:14:02.113000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.120000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.126000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.119266 systemd-modules-load[342]: Inserted module 'br_netfilter' Jan 28 01:14:02.152493 systemd[1]: Started systemd-journald.service - Journal Service. Jan 28 01:14:02.152000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:02.159029 kernel: audit: type=1130 audit(1769562842.152:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.162361 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 28 01:14:02.164326 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 28 01:14:02.171561 kernel: audit: type=1130 audit(1769562842.166:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.166363 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 28 01:14:02.176619 kernel: audit: type=1130 audit(1769562842.172:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.172000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.177000 audit: BPF prog-id=6 op=LOAD Jan 28 01:14:02.179025 kernel: audit: type=1334 audit(1769562842.177:8): prog-id=6 op=LOAD Jan 28 01:14:02.180644 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 28 01:14:02.186566 kernel: audit: type=1130 audit(1769562842.182:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:02.182000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.181577 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 28 01:14:02.188832 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 28 01:14:02.191537 systemd-tmpfiles[365]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 28 01:14:02.196264 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 28 01:14:02.201554 kernel: audit: type=1130 audit(1769562842.196:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.196000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.208420 dracut-cmdline[378]: dracut-109 Jan 28 01:14:02.211820 dracut-cmdline[378]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=71544b7bf64a92b2aba342c16b083723a12bedf106d3ddb24ccb63046196f1b3 Jan 28 01:14:02.244805 systemd-resolved[376]: Positive Trust Anchors: Jan 28 01:14:02.244822 systemd-resolved[376]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 28 01:14:02.244825 systemd-resolved[376]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 28 01:14:02.244856 systemd-resolved[376]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 28 01:14:02.270161 systemd-resolved[376]: Defaulting to hostname 'linux'. Jan 28 01:14:02.271880 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 28 01:14:02.272000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.272607 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 28 01:14:02.322075 kernel: Loading iSCSI transport class v2.0-870. Jan 28 01:14:02.340036 kernel: iscsi: registered transport (tcp) Jan 28 01:14:02.369579 kernel: iscsi: registered transport (qla4xxx) Jan 28 01:14:02.370395 kernel: QLogic iSCSI HBA Driver Jan 28 01:14:02.399218 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 28 01:14:02.417108 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 28 01:14:02.417000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.419775 systemd[1]: Reached target network-pre.target - Preparation for Network. 
Jan 28 01:14:02.465397 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 28 01:14:02.466000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.467908 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 28 01:14:02.470108 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 28 01:14:02.503826 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 28 01:14:02.504000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.505000 audit: BPF prog-id=7 op=LOAD Jan 28 01:14:02.505000 audit: BPF prog-id=8 op=LOAD Jan 28 01:14:02.505736 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 28 01:14:02.535131 systemd-udevd[608]: Using default interface naming scheme 'v257'. Jan 28 01:14:02.544712 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 01:14:02.545000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.546427 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 28 01:14:02.572302 dracut-pre-trigger[669]: rd.md=0: removing MD RAID activation Jan 28 01:14:02.587354 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 28 01:14:02.589000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:02.590000 audit: BPF prog-id=9 op=LOAD Jan 28 01:14:02.591136 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 28 01:14:02.605203 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 28 01:14:02.606000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.609196 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 28 01:14:02.638436 systemd-networkd[737]: lo: Link UP Jan 28 01:14:02.640000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.639323 systemd-networkd[737]: lo: Gained carrier Jan 28 01:14:02.639820 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 28 01:14:02.640515 systemd[1]: Reached target network.target - Network. Jan 28 01:14:02.701636 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 28 01:14:02.704000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.707152 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 28 01:14:02.839381 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 28 01:14:02.851264 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 28 01:14:02.862138 kernel: cryptd: max_cpu_qlen set to 1000 Jan 28 01:14:02.861972 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. 
Jan 28 01:14:02.875519 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 28 01:14:02.884277 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Jan 28 01:14:02.884304 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 28 01:14:02.880437 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 28 01:14:02.903020 kernel: usbcore: registered new interface driver usbhid Jan 28 01:14:02.903091 kernel: usbhid: USB HID core driver Jan 28 01:14:02.905026 kernel: AES CTR mode by8 optimization enabled Jan 28 01:14:02.905218 disk-uuid[790]: Primary Header is updated. Jan 28 01:14:02.905218 disk-uuid[790]: Secondary Entries is updated. Jan 28 01:14:02.905218 disk-uuid[790]: Secondary Header is updated. Jan 28 01:14:02.926406 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 01:14:02.933000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.926538 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:14:02.933155 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:14:02.941251 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 28 01:14:02.953194 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jan 28 01:14:02.953229 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 28 01:14:02.942110 systemd-networkd[737]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:14:02.942114 systemd-networkd[737]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 28 01:14:02.943942 systemd-networkd[737]: eth0: Link UP Jan 28 01:14:02.944593 systemd-networkd[737]: eth0: Gained carrier Jan 28 01:14:02.944606 systemd-networkd[737]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:14:02.960064 systemd-networkd[737]: eth0: DHCPv4 address 10.0.0.143/25, gateway 10.0.0.129 acquired from 10.0.0.129 Jan 28 01:14:02.990601 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 01:14:02.991290 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:14:02.993000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.993000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:02.997108 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:14:03.026251 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 28 01:14:03.027000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:03.071600 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 28 01:14:03.072000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:03.073546 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 28 01:14:03.074019 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 28 01:14:03.075490 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 28 01:14:03.076940 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 28 01:14:03.109788 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 28 01:14:03.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:04.009831 disk-uuid[793]: Warning: The kernel is still using the old partition table. Jan 28 01:14:04.009831 disk-uuid[793]: The new table will be used at the next reboot or after you Jan 28 01:14:04.009831 disk-uuid[793]: run partprobe(8) or kpartx(8) Jan 28 01:14:04.009831 disk-uuid[793]: The operation has completed successfully. Jan 28 01:14:04.015933 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 28 01:14:04.024069 kernel: kauditd_printk_skb: 18 callbacks suppressed Jan 28 01:14:04.024107 kernel: audit: type=1130 audit(1769562844.016:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:04.024122 kernel: audit: type=1131 audit(1769562844.016:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:04.016000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:04.016000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:04.016088 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 28 01:14:04.019179 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 28 01:14:04.076031 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (912) Jan 28 01:14:04.080235 kernel: BTRFS info (device vda6): first mount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:14:04.080293 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 28 01:14:04.088344 kernel: BTRFS info (device vda6): turning on async discard Jan 28 01:14:04.088409 kernel: BTRFS info (device vda6): enabling free space tree Jan 28 01:14:04.095031 kernel: BTRFS info (device vda6): last unmount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:14:04.095423 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 28 01:14:04.099656 kernel: audit: type=1130 audit(1769562844.095:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:04.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:04.098154 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 28 01:14:04.334926 ignition[931]: Ignition 2.24.0 Jan 28 01:14:04.334941 ignition[931]: Stage: fetch-offline Jan 28 01:14:04.343385 kernel: audit: type=1130 audit(1769562844.339:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:04.339000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:04.338060 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 28 01:14:04.334992 ignition[931]: no configs at "/usr/lib/ignition/base.d" Jan 28 01:14:04.336051 ignition[931]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 01:14:04.344144 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 28 01:14:04.336183 ignition[931]: parsed url from cmdline: "" Jan 28 01:14:04.336188 ignition[931]: no config URL provided Jan 28 01:14:04.336195 ignition[931]: reading system config file "/usr/lib/ignition/user.ign" Jan 28 01:14:04.336209 ignition[931]: no config at "/usr/lib/ignition/user.ign" Jan 28 01:14:04.336215 ignition[931]: failed to fetch config: resource requires networking Jan 28 01:14:04.336390 ignition[931]: Ignition finished successfully Jan 28 01:14:04.374264 ignition[938]: Ignition 2.24.0 Jan 28 01:14:04.375052 ignition[938]: Stage: fetch Jan 28 01:14:04.375238 ignition[938]: no configs at "/usr/lib/ignition/base.d" Jan 28 01:14:04.375246 ignition[938]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 01:14:04.375327 ignition[938]: parsed url from cmdline: "" Jan 28 01:14:04.375330 ignition[938]: no config URL provided Jan 28 01:14:04.375335 ignition[938]: reading system config file "/usr/lib/ignition/user.ign" Jan 28 01:14:04.375340 ignition[938]: no config at "/usr/lib/ignition/user.ign" Jan 28 01:14:04.375418 ignition[938]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 28 01:14:04.376506 ignition[938]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 28 01:14:04.376524 ignition[938]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Jan 28 01:14:04.850250 systemd-networkd[737]: eth0: Gained IPv6LL Jan 28 01:14:05.200971 ignition[938]: GET result: OK Jan 28 01:14:05.201165 ignition[938]: parsing config with SHA512: 13c6a0ad64eb40a97389c1f90674e26f5f89f93b4ce5fae5634ad62dcfc0279badea1da798690c7cc3b31f0a3f4ceada81ae059105bf2388f3f2c1f11a85dc78 Jan 28 01:14:05.211713 unknown[938]: fetched base config from "system" Jan 28 01:14:05.211726 unknown[938]: fetched base config from "system" Jan 28 01:14:05.212144 ignition[938]: fetch: fetch complete Jan 28 01:14:05.211732 unknown[938]: fetched user config from "openstack" Jan 28 01:14:05.212149 ignition[938]: fetch: fetch passed Jan 28 01:14:05.212204 ignition[938]: Ignition finished successfully Jan 28 01:14:05.215335 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 28 01:14:05.215000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.216927 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 28 01:14:05.221781 kernel: audit: type=1130 audit(1769562845.215:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.254412 ignition[945]: Ignition 2.24.0 Jan 28 01:14:05.254424 ignition[945]: Stage: kargs Jan 28 01:14:05.254574 ignition[945]: no configs at "/usr/lib/ignition/base.d" Jan 28 01:14:05.254582 ignition[945]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 01:14:05.255376 ignition[945]: kargs: kargs passed Jan 28 01:14:05.258000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.257444 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Jan 28 01:14:05.255417 ignition[945]: Ignition finished successfully Jan 28 01:14:05.264419 kernel: audit: type=1130 audit(1769562845.258:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.262141 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 28 01:14:05.284429 ignition[951]: Ignition 2.24.0 Jan 28 01:14:05.285070 ignition[951]: Stage: disks Jan 28 01:14:05.285235 ignition[951]: no configs at "/usr/lib/ignition/base.d" Jan 28 01:14:05.285243 ignition[951]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 01:14:05.287291 ignition[951]: disks: disks passed Jan 28 01:14:05.287698 ignition[951]: Ignition finished successfully Jan 28 01:14:05.288908 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 28 01:14:05.289000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.289750 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 28 01:14:05.295298 kernel: audit: type=1130 audit(1769562845.289:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.294953 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 28 01:14:05.295609 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 28 01:14:05.296319 systemd[1]: Reached target sysinit.target - System Initialization. Jan 28 01:14:05.296946 systemd[1]: Reached target basic.target - Basic System. Jan 28 01:14:05.298579 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Jan 28 01:14:05.353455 systemd-fsck[959]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 28 01:14:05.359161 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 28 01:14:05.359000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.360843 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 28 01:14:05.365200 kernel: audit: type=1130 audit(1769562845.359:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.547025 kernel: EXT4-fs (vda9): mounted filesystem 60a46795-cc10-4076-a709-d039d1c23a6b r/w with ordered data mode. Quota mode: none. Jan 28 01:14:05.547515 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 28 01:14:05.548702 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 28 01:14:05.553113 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 28 01:14:05.556085 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 28 01:14:05.557165 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 28 01:14:05.561127 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 28 01:14:05.562069 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 28 01:14:05.562869 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 28 01:14:05.564920 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 28 01:14:05.568129 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 28 01:14:05.584628 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (967) Jan 28 01:14:05.588028 kernel: BTRFS info (device vda6): first mount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:14:05.590031 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 28 01:14:05.599644 kernel: BTRFS info (device vda6): turning on async discard Jan 28 01:14:05.599701 kernel: BTRFS info (device vda6): enabling free space tree Jan 28 01:14:05.601739 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 28 01:14:05.676034 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:05.825932 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 28 01:14:05.830425 kernel: audit: type=1130 audit(1769562845.826:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.826000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.829095 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 28 01:14:05.832122 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 28 01:14:05.849829 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Jan 28 01:14:05.851812 kernel: BTRFS info (device vda6): last unmount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:14:05.877260 ignition[1067]: INFO : Ignition 2.24.0 Jan 28 01:14:05.878757 ignition[1067]: INFO : Stage: mount Jan 28 01:14:05.880345 ignition[1067]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 01:14:05.880345 ignition[1067]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 01:14:05.881772 ignition[1067]: INFO : mount: mount passed Jan 28 01:14:05.881772 ignition[1067]: INFO : Ignition finished successfully Jan 28 01:14:05.886306 kernel: audit: type=1130 audit(1769562845.882:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.882000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.881775 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 28 01:14:05.886000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:05.883109 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Jan 28 01:14:06.734032 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:08.746050 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:12.757046 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:12.767031 coreos-metadata[969]: Jan 28 01:14:12.766 WARN failed to locate config-drive, using the metadata service API instead Jan 28 01:14:12.792433 coreos-metadata[969]: Jan 28 01:14:12.792 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 28 01:14:13.438649 coreos-metadata[969]: Jan 28 01:14:13.438 INFO Fetch successful Jan 28 01:14:13.438649 coreos-metadata[969]: Jan 28 01:14:13.438 INFO wrote hostname ci-4593-0-0-n-62761e1650 to /sysroot/etc/hostname Jan 28 01:14:13.440995 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 28 01:14:13.449487 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:14:13.449515 kernel: audit: type=1130 audit(1769562853.441:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:13.449529 kernel: audit: type=1131 audit(1769562853.441:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:13.441000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:13.441000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:13.441123 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 28 01:14:13.442204 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 28 01:14:13.463058 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 28 01:14:13.500038 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1084) Jan 28 01:14:13.505892 kernel: BTRFS info (device vda6): first mount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:14:13.505961 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 28 01:14:13.513464 kernel: BTRFS info (device vda6): turning on async discard Jan 28 01:14:13.514101 kernel: BTRFS info (device vda6): enabling free space tree Jan 28 01:14:13.515615 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 28 01:14:13.548493 ignition[1101]: INFO : Ignition 2.24.0 Jan 28 01:14:13.548493 ignition[1101]: INFO : Stage: files Jan 28 01:14:13.550167 ignition[1101]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 01:14:13.550167 ignition[1101]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 01:14:13.550167 ignition[1101]: DEBUG : files: compiled without relabeling support, skipping Jan 28 01:14:13.552081 ignition[1101]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 28 01:14:13.552081 ignition[1101]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 28 01:14:13.561250 ignition[1101]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 28 01:14:13.561824 ignition[1101]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 28 01:14:13.561824 ignition[1101]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 28 01:14:13.561614 unknown[1101]: wrote ssh authorized keys file for user: core Jan 28 01:14:13.564423 
ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 28 01:14:13.565320 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Jan 28 01:14:13.627585 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 28 01:14:13.748223 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 28 01:14:13.748223 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 28 01:14:13.750437 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 28 01:14:13.750437 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 28 01:14:13.750437 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 28 01:14:13.750437 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 28 01:14:13.750437 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 28 01:14:13.750437 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 28 01:14:13.750437 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 28 01:14:13.753719 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 28 01:14:13.753719 
ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 28 01:14:13.753719 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 28 01:14:13.753719 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 28 01:14:13.753719 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 28 01:14:13.753719 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Jan 28 01:14:13.856559 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 28 01:14:14.506829 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 28 01:14:14.506829 ignition[1101]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 28 01:14:14.510570 ignition[1101]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 28 01:14:14.513715 ignition[1101]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 28 01:14:14.513715 ignition[1101]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 28 01:14:14.513715 ignition[1101]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 28 01:14:14.516116 ignition[1101]: INFO : files: op(d): [finished] 
setting preset to enabled for "prepare-helm.service" Jan 28 01:14:14.516116 ignition[1101]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 28 01:14:14.516116 ignition[1101]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 28 01:14:14.516116 ignition[1101]: INFO : files: files passed Jan 28 01:14:14.516116 ignition[1101]: INFO : Ignition finished successfully Jan 28 01:14:14.524248 kernel: audit: type=1130 audit(1769562854.517:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.517000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.517072 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 28 01:14:14.520656 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 28 01:14:14.525603 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 28 01:14:14.539039 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 28 01:14:14.539865 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 28 01:14:14.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.545048 kernel: audit: type=1130 audit(1769562854.540:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:14.545087 kernel: audit: type=1131 audit(1769562854.540:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.540000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.552832 initrd-setup-root-after-ignition[1133]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 28 01:14:14.552832 initrd-setup-root-after-ignition[1133]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 28 01:14:14.554977 initrd-setup-root-after-ignition[1137]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 28 01:14:14.557420 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 28 01:14:14.562386 kernel: audit: type=1130 audit(1769562854.558:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.558183 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 28 01:14:14.563839 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 28 01:14:14.607089 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 28 01:14:14.608105 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Jan 28 01:14:14.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.610863 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 28 01:14:14.618237 kernel: audit: type=1130 audit(1769562854.609:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.618269 kernel: audit: type=1131 audit(1769562854.610:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.610000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.617093 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 28 01:14:14.618966 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 28 01:14:14.620478 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 28 01:14:14.646249 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 28 01:14:14.647000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.650120 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 28 01:14:14.652517 kernel: audit: type=1130 audit(1769562854.647:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 28 01:14:14.668706 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 28 01:14:14.669632 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 28 01:14:14.670306 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 28 01:14:14.671388 systemd[1]: Stopped target timers.target - Timer Units. Jan 28 01:14:14.672336 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 28 01:14:14.676940 kernel: audit: type=1131 audit(1769562854.673:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.673000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.672450 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 28 01:14:14.677041 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 28 01:14:14.677607 systemd[1]: Stopped target basic.target - Basic System. Jan 28 01:14:14.678596 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 28 01:14:14.679613 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 28 01:14:14.680507 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 28 01:14:14.681436 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 28 01:14:14.682327 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 28 01:14:14.683285 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 28 01:14:14.684202 systemd[1]: Stopped target sysinit.target - System Initialization. 
Jan 28 01:14:14.685069 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 28 01:14:14.685936 systemd[1]: Stopped target swap.target - Swaps. Jan 28 01:14:14.687000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.686803 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 28 01:14:14.686915 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 28 01:14:14.688184 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 28 01:14:14.689094 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 28 01:14:14.689814 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 28 01:14:14.691000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.689918 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 28 01:14:14.690662 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 28 01:14:14.692000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.690790 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 28 01:14:14.693000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.691907 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. 
Jan 28 01:14:14.692000 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 28 01:14:14.692860 systemd[1]: ignition-files.service: Deactivated successfully. Jan 28 01:14:14.692964 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 28 01:14:14.695217 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 28 01:14:14.698252 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 28 01:14:14.700000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.698730 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 28 01:14:14.698871 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 28 01:14:14.700163 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 28 01:14:14.701000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.700260 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 28 01:14:14.702000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.701760 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 28 01:14:14.701873 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 28 01:14:14.707000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:14.707000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.706838 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 28 01:14:14.706936 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 28 01:14:14.721499 ignition[1157]: INFO : Ignition 2.24.0 Jan 28 01:14:14.721499 ignition[1157]: INFO : Stage: umount Jan 28 01:14:14.724067 ignition[1157]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 01:14:14.724067 ignition[1157]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 01:14:14.724067 ignition[1157]: INFO : umount: umount passed Jan 28 01:14:14.724067 ignition[1157]: INFO : Ignition finished successfully Jan 28 01:14:14.725000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.728000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.724743 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 28 01:14:14.728000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.724899 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 28 01:14:14.729000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.726798 systemd[1]: ignition-disks.service: Deactivated successfully. 
Jan 28 01:14:14.726901 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 28 01:14:14.728159 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 28 01:14:14.733000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.728203 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 28 01:14:14.728659 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 28 01:14:14.728699 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 28 01:14:14.729693 systemd[1]: Stopped target network.target - Network. Jan 28 01:14:14.731492 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 28 01:14:14.731551 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 28 01:14:14.733117 systemd[1]: Stopped target paths.target - Path Units. Jan 28 01:14:14.734296 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 28 01:14:14.738067 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 28 01:14:14.738535 systemd[1]: Stopped target slices.target - Slice Units. Jan 28 01:14:14.739450 systemd[1]: Stopped target sockets.target - Socket Units. Jan 28 01:14:14.740419 systemd[1]: iscsid.socket: Deactivated successfully. Jan 28 01:14:14.740461 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 28 01:14:14.741350 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 28 01:14:14.741382 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 28 01:14:14.744000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:14.742255 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 28 01:14:14.744000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.742278 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 28 01:14:14.743167 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 28 01:14:14.743212 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 28 01:14:14.744102 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 28 01:14:14.744138 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 28 01:14:14.745104 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 28 01:14:14.746224 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 28 01:14:14.751000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.750114 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 28 01:14:14.750689 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 28 01:14:14.753000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.750772 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 28 01:14:14.752399 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 28 01:14:14.752478 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 28 01:14:14.756459 systemd[1]: systemd-networkd.service: Deactivated successfully. 
Jan 28 01:14:14.756557 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 28 01:14:14.757000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.759595 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 28 01:14:14.760000 audit: BPF prog-id=9 op=UNLOAD Jan 28 01:14:14.759692 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 28 01:14:14.760000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.761865 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 28 01:14:14.762491 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 28 01:14:14.763000 audit: BPF prog-id=6 op=UNLOAD Jan 28 01:14:14.762536 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 28 01:14:14.765166 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 28 01:14:14.765973 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 28 01:14:14.766437 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 28 01:14:14.767000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.767313 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 28 01:14:14.768000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:14.768000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.767775 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 28 01:14:14.768177 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 28 01:14:14.768214 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 28 01:14:14.768708 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 28 01:14:14.783458 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 28 01:14:14.784213 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 01:14:14.785000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.785381 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 28 01:14:14.785836 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 28 01:14:14.786679 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 28 01:14:14.786706 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 28 01:14:14.788000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:14.787880 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 28 01:14:14.787925 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 28 01:14:14.788594 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
Jan 28 01:14:14.790000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.790000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.788636 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 28 01:14:14.790342 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 28 01:14:14.790389 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 28 01:14:14.791565 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 28 01:14:14.794000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.795000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.794086 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jan 28 01:14:14.794159 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jan 28 01:14:14.794852 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 28 01:14:14.797000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.797000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.794906 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 28 01:14:14.798000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.795315 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 28 01:14:14.795349 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 28 01:14:14.797098 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 28 01:14:14.797135 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 28 01:14:14.798056 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 28 01:14:14.798095 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 28 01:14:14.800765 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 28 01:14:14.803304 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 28 01:14:14.804000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.810179 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 28 01:14:14.810000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.810000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:14.810278 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 28 01:14:14.811070 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 28 01:14:14.812679 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 28 01:14:14.848989 systemd[1]: Switching root.
Jan 28 01:14:14.899534 systemd-journald[340]: Journal stopped
Jan 28 01:14:16.187244 systemd-journald[340]: Received SIGTERM from PID 1 (systemd).
Jan 28 01:14:16.187672 kernel: SELinux: policy capability network_peer_controls=1
Jan 28 01:14:16.187697 kernel: SELinux: policy capability open_perms=1
Jan 28 01:14:16.187713 kernel: SELinux: policy capability extended_socket_class=1
Jan 28 01:14:16.187729 kernel: SELinux: policy capability always_check_network=0
Jan 28 01:14:16.187739 kernel: SELinux: policy capability cgroup_seclabel=1
Jan 28 01:14:16.187754 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jan 28 01:14:16.187766 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jan 28 01:14:16.187779 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jan 28 01:14:16.187794 kernel: SELinux: policy capability userspace_initial_context=0
Jan 28 01:14:16.187808 systemd[1]: Successfully loaded SELinux policy in 71.559ms.
Jan 28 01:14:16.187831 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.473ms.
Jan 28 01:14:16.187844 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jan 28 01:14:16.187857 systemd[1]: Detected virtualization kvm.
Jan 28 01:14:16.187869 systemd[1]: Detected architecture x86-64.
Jan 28 01:14:16.187880 systemd[1]: Detected first boot.
Jan 28 01:14:16.187894 systemd[1]: Hostname set to .
Jan 28 01:14:16.187905 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Jan 28 01:14:16.187917 zram_generator::config[1201]: No configuration found.
Jan 28 01:14:16.187936 kernel: Guest personality initialized and is inactive
Jan 28 01:14:16.187947 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Jan 28 01:14:16.187957 kernel: Initialized host personality
Jan 28 01:14:16.187968 kernel: NET: Registered PF_VSOCK protocol family
Jan 28 01:14:16.187980 systemd[1]: Populated /etc with preset unit settings.
Jan 28 01:14:16.187997 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 28 01:14:16.190025 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 28 01:14:16.190040 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 28 01:14:16.190056 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 28 01:14:16.190069 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 28 01:14:16.190080 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 28 01:14:16.190093 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 28 01:14:16.190105 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 28 01:14:16.190116 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 28 01:14:16.190133 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 28 01:14:16.190143 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 28 01:14:16.190158 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 28 01:14:16.190170 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 28 01:14:16.190183 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 28 01:14:16.190195 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 28 01:14:16.190207 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 28 01:14:16.190221 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 28 01:14:16.190232 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jan 28 01:14:16.190245 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 28 01:14:16.190257 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 28 01:14:16.190269 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jan 28 01:14:16.190280 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jan 28 01:14:16.190292 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jan 28 01:14:16.190303 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 28 01:14:16.190314 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 28 01:14:16.190328 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 28 01:14:16.190339 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes.
Jan 28 01:14:16.190350 systemd[1]: Reached target slices.target - Slice Units.
Jan 28 01:14:16.190367 systemd[1]: Reached target swap.target - Swaps.
Jan 28 01:14:16.190378 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 28 01:14:16.190390 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 28 01:14:16.190402 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jan 28 01:14:16.190418 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Jan 28 01:14:16.190429 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket.
Jan 28 01:14:16.190441 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 28 01:14:16.190453 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket.
Jan 28 01:14:16.190464 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket.
Jan 28 01:14:16.190476 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 28 01:14:16.190487 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 28 01:14:16.190499 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 28 01:14:16.190512 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 28 01:14:16.190523 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 28 01:14:16.190534 systemd[1]: Mounting media.mount - External Media Directory...
Jan 28 01:14:16.190545 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 28 01:14:16.190556 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 28 01:14:16.190567 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 28 01:14:16.190581 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 28 01:14:16.190593 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 28 01:14:16.190605 systemd[1]: Reached target machines.target - Containers.
Jan 28 01:14:16.190616 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 28 01:14:16.190629 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 28 01:14:16.190641 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 28 01:14:16.190653 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 28 01:14:16.190666 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 28 01:14:16.190677 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 28 01:14:16.190687 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 28 01:14:16.190699 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 28 01:14:16.190710 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 28 01:14:16.190724 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 28 01:14:16.190735 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 28 01:14:16.190751 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jan 28 01:14:16.190763 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jan 28 01:14:16.190774 systemd[1]: Stopped systemd-fsck-usr.service.
Jan 28 01:14:16.190787 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jan 28 01:14:16.190799 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 28 01:14:16.190811 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 28 01:14:16.190822 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 28 01:14:16.190843 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 28 01:14:16.190855 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jan 28 01:14:16.190867 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 28 01:14:16.190882 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 28 01:14:16.190896 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 28 01:14:16.190907 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 28 01:14:16.190918 systemd[1]: Mounted media.mount - External Media Directory.
Jan 28 01:14:16.190929 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 28 01:14:16.190942 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 28 01:14:16.190953 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 28 01:14:16.190964 kernel: ACPI: bus type drm_connector registered
Jan 28 01:14:16.190976 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 28 01:14:16.190988 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 28 01:14:16.190999 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 28 01:14:16.191029 kernel: fuse: init (API version 7.41)
Jan 28 01:14:16.191042 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 28 01:14:16.191053 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 28 01:14:16.191065 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 28 01:14:16.191077 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 28 01:14:16.191088 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 28 01:14:16.191098 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 28 01:14:16.191109 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 28 01:14:16.191123 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 28 01:14:16.191134 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 28 01:14:16.191144 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 28 01:14:16.191156 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 28 01:14:16.191189 systemd-journald[1271]: Collecting audit messages is enabled.
Jan 28 01:14:16.191219 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 28 01:14:16.191232 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jan 28 01:14:16.191245 systemd-journald[1271]: Journal started
Jan 28 01:14:16.191268 systemd-journald[1271]: Runtime Journal (/run/log/journal/f1bbc2cc1d7f4404a4a0f9def3d0e3be) is 8M, max 77.9M, 69.9M free.
Jan 28 01:14:15.909000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1
Jan 28 01:14:16.041000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.045000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.047000 audit: BPF prog-id=14 op=UNLOAD
Jan 28 01:14:16.047000 audit: BPF prog-id=13 op=UNLOAD
Jan 28 01:14:16.048000 audit: BPF prog-id=15 op=LOAD
Jan 28 01:14:16.048000 audit: BPF prog-id=16 op=LOAD
Jan 28 01:14:16.048000 audit: BPF prog-id=17 op=LOAD
Jan 28 01:14:16.149000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.155000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.155000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.161000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.166000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.170000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.170000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.175000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.175000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.180000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.180000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.183000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Jan 28 01:14:16.183000 audit[1271]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=4 a1=7ffd7c65aa20 a2=4000 a3=0 items=0 ppid=1 pid=1271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 28 01:14:16.183000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Jan 28 01:14:16.184000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.187000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.190000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:15.842235 systemd[1]: Queued start job for default target multi-user.target.
Jan 28 01:14:16.194145 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 28 01:14:15.850955 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Jan 28 01:14:15.851393 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 28 01:14:16.195000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.197526 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 28 01:14:16.198000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.214331 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 28 01:14:16.217088 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket.
Jan 28 01:14:16.217620 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jan 28 01:14:16.217651 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 28 01:14:16.218985 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jan 28 01:14:16.220380 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 28 01:14:16.220491 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Jan 28 01:14:16.225149 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jan 28 01:14:16.230137 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jan 28 01:14:16.230630 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 28 01:14:16.231930 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jan 28 01:14:16.232535 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 28 01:14:16.233820 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 28 01:14:16.241625 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jan 28 01:14:16.244551 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 28 01:14:16.246190 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jan 28 01:14:16.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.266749 systemd-journald[1271]: Time spent on flushing to /var/log/journal/f1bbc2cc1d7f4404a4a0f9def3d0e3be is 64.464ms for 1843 entries.
Jan 28 01:14:16.266749 systemd-journald[1271]: System Journal (/var/log/journal/f1bbc2cc1d7f4404a4a0f9def3d0e3be) is 8M, max 588.1M, 580.1M free.
Jan 28 01:14:16.341383 systemd-journald[1271]: Received client request to flush runtime journal.
Jan 28 01:14:16.341435 kernel: loop1: detected capacity change from 0 to 1656
Jan 28 01:14:16.274000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.278000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.305000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.327000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.273084 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 28 01:14:16.277844 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jan 28 01:14:16.278612 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jan 28 01:14:16.284225 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jan 28 01:14:16.305321 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 28 01:14:16.319201 systemd-tmpfiles[1325]: ACLs are not supported, ignoring.
Jan 28 01:14:16.319213 systemd-tmpfiles[1325]: ACLs are not supported, ignoring.
Jan 28 01:14:16.325257 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 28 01:14:16.329798 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jan 28 01:14:16.345343 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jan 28 01:14:16.345000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.350071 kernel: loop2: detected capacity change from 0 to 111560
Jan 28 01:14:16.364704 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jan 28 01:14:16.365000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.396043 kernel: loop3: detected capacity change from 0 to 229808
Jan 28 01:14:16.401058 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jan 28 01:14:16.401000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.403000 audit: BPF prog-id=18 op=LOAD
Jan 28 01:14:16.403000 audit: BPF prog-id=19 op=LOAD
Jan 28 01:14:16.403000 audit: BPF prog-id=20 op=LOAD
Jan 28 01:14:16.406152 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer...
Jan 28 01:14:16.407000 audit: BPF prog-id=21 op=LOAD
Jan 28 01:14:16.409242 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 28 01:14:16.412284 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 28 01:14:16.417000 audit: BPF prog-id=22 op=LOAD
Jan 28 01:14:16.417000 audit: BPF prog-id=23 op=LOAD
Jan 28 01:14:16.417000 audit: BPF prog-id=24 op=LOAD
Jan 28 01:14:16.419138 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager...
Jan 28 01:14:16.420000 audit: BPF prog-id=25 op=LOAD
Jan 28 01:14:16.421000 audit: BPF prog-id=26 op=LOAD
Jan 28 01:14:16.421000 audit: BPF prog-id=27 op=LOAD
Jan 28 01:14:16.422202 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jan 28 01:14:16.456032 kernel: loop4: detected capacity change from 0 to 50784
Jan 28 01:14:16.472995 systemd-tmpfiles[1348]: ACLs are not supported, ignoring.
Jan 28 01:14:16.473041 systemd-tmpfiles[1348]: ACLs are not supported, ignoring.
Jan 28 01:14:16.482000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.481115 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 28 01:14:16.481600 systemd-nsresourced[1349]: Not setting up BPF subsystem, as functionality has been disabled at compile time.
Jan 28 01:14:16.483374 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager.
Jan 28 01:14:16.485000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.498000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.497049 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jan 28 01:14:16.512052 kernel: loop5: detected capacity change from 0 to 1656
Jan 28 01:14:16.524043 kernel: loop6: detected capacity change from 0 to 111560
Jan 28 01:14:16.554027 kernel: loop7: detected capacity change from 0 to 229808
Jan 28 01:14:16.588604 systemd-oomd[1346]: No swap; memory pressure usage will be degraded
Jan 28 01:14:16.589131 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer.
Jan 28 01:14:16.591000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.593023 kernel: loop1: detected capacity change from 0 to 50784
Jan 28 01:14:16.606305 systemd-resolved[1347]: Positive Trust Anchors:
Jan 28 01:14:16.606590 systemd-resolved[1347]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 28 01:14:16.606628 systemd-resolved[1347]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Jan 28 01:14:16.606775 systemd-resolved[1347]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 28 01:14:16.621037 (sd-merge)[1366]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'.
Jan 28 01:14:16.634074 (sd-merge)[1366]: Merged extensions into '/usr'.
Jan 28 01:14:16.635160 systemd-resolved[1347]: Using system hostname 'ci-4593-0-0-n-62761e1650'.
Jan 28 01:14:16.638665 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 28 01:14:16.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:14:16.639620 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 28 01:14:16.640397 systemd[1]: Reload requested from client PID 1324 ('systemd-sysext') (unit systemd-sysext.service)...
Jan 28 01:14:16.640481 systemd[1]: Reloading...
Jan 28 01:14:16.706042 zram_generator::config[1396]: No configuration found. Jan 28 01:14:16.909942 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 28 01:14:16.910489 systemd[1]: Reloading finished in 269 ms. Jan 28 01:14:16.945098 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 28 01:14:16.945000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:16.946257 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 28 01:14:16.946000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:16.951464 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 28 01:14:16.952996 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 28 01:14:16.961847 systemd[1]: Starting ensure-sysext.service... Jan 28 01:14:16.965763 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 28 01:14:16.966000 audit: BPF prog-id=8 op=UNLOAD Jan 28 01:14:16.966000 audit: BPF prog-id=7 op=UNLOAD Jan 28 01:14:16.967000 audit: BPF prog-id=28 op=LOAD Jan 28 01:14:16.967000 audit: BPF prog-id=29 op=LOAD Jan 28 01:14:16.971144 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 28 01:14:16.974000 audit: BPF prog-id=30 op=LOAD Jan 28 01:14:16.974000 audit: BPF prog-id=15 op=UNLOAD Jan 28 01:14:16.974000 audit: BPF prog-id=31 op=LOAD Jan 28 01:14:16.974000 audit: BPF prog-id=32 op=LOAD Jan 28 01:14:16.974000 audit: BPF prog-id=16 op=UNLOAD Jan 28 01:14:16.974000 audit: BPF prog-id=17 op=UNLOAD Jan 28 01:14:16.976000 audit: BPF prog-id=33 op=LOAD Jan 28 01:14:16.976000 audit: BPF prog-id=22 op=UNLOAD Jan 28 01:14:16.976000 audit: BPF prog-id=34 op=LOAD Jan 28 01:14:16.976000 audit: BPF prog-id=35 op=LOAD Jan 28 01:14:16.976000 audit: BPF prog-id=23 op=UNLOAD Jan 28 01:14:16.976000 audit: BPF prog-id=24 op=UNLOAD Jan 28 01:14:16.976000 audit: BPF prog-id=36 op=LOAD Jan 28 01:14:16.976000 audit: BPF prog-id=25 op=UNLOAD Jan 28 01:14:16.976000 audit: BPF prog-id=37 op=LOAD Jan 28 01:14:16.976000 audit: BPF prog-id=38 op=LOAD Jan 28 01:14:16.976000 audit: BPF prog-id=26 op=UNLOAD Jan 28 01:14:16.976000 audit: BPF prog-id=27 op=UNLOAD Jan 28 01:14:16.977000 audit: BPF prog-id=39 op=LOAD Jan 28 01:14:16.977000 audit: BPF prog-id=21 op=UNLOAD Jan 28 01:14:16.979000 audit: BPF prog-id=40 op=LOAD Jan 28 01:14:16.979000 audit: BPF prog-id=18 op=UNLOAD Jan 28 01:14:16.979000 audit: BPF prog-id=41 op=LOAD Jan 28 01:14:16.979000 audit: BPF prog-id=42 op=LOAD Jan 28 01:14:16.979000 audit: BPF prog-id=19 op=UNLOAD Jan 28 01:14:16.979000 audit: BPF prog-id=20 op=UNLOAD Jan 28 01:14:16.982606 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 28 01:14:16.984282 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 28 01:14:16.987466 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 28 01:14:16.987502 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. 
Jan 28 01:14:16.987700 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 28 01:14:16.988633 systemd-tmpfiles[1445]: ACLs are not supported, ignoring. Jan 28 01:14:16.988677 systemd-tmpfiles[1445]: ACLs are not supported, ignoring. Jan 28 01:14:16.989472 systemd[1]: Reload requested from client PID 1444 ('systemctl') (unit ensure-sysext.service)... Jan 28 01:14:16.989487 systemd[1]: Reloading... Jan 28 01:14:17.000396 systemd-tmpfiles[1445]: Detected autofs mount point /boot during canonicalization of boot. Jan 28 01:14:17.000538 systemd-tmpfiles[1445]: Skipping /boot Jan 28 01:14:17.008551 systemd-tmpfiles[1445]: Detected autofs mount point /boot during canonicalization of boot. Jan 28 01:14:17.008560 systemd-tmpfiles[1445]: Skipping /boot Jan 28 01:14:17.040923 systemd-udevd[1446]: Using default interface naming scheme 'v257'. Jan 28 01:14:17.074038 zram_generator::config[1488]: No configuration found. Jan 28 01:14:17.247048 kernel: mousedev: PS/2 mouse device common for all mice Jan 28 01:14:17.278509 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 28 01:14:17.278920 systemd[1]: Reloading finished in 289 ms. Jan 28 01:14:17.282025 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 28 01:14:17.286376 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 01:14:17.287000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.288581 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 28 01:14:17.289021 kernel: ACPI: button: Power Button [PWRF] Jan 28 01:14:17.289000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.292000 audit: BPF prog-id=43 op=LOAD Jan 28 01:14:17.292000 audit: BPF prog-id=33 op=UNLOAD Jan 28 01:14:17.292000 audit: BPF prog-id=44 op=LOAD Jan 28 01:14:17.292000 audit: BPF prog-id=45 op=LOAD Jan 28 01:14:17.292000 audit: BPF prog-id=34 op=UNLOAD Jan 28 01:14:17.292000 audit: BPF prog-id=35 op=UNLOAD Jan 28 01:14:17.293000 audit: BPF prog-id=46 op=LOAD Jan 28 01:14:17.293000 audit: BPF prog-id=40 op=UNLOAD Jan 28 01:14:17.293000 audit: BPF prog-id=47 op=LOAD Jan 28 01:14:17.293000 audit: BPF prog-id=48 op=LOAD Jan 28 01:14:17.293000 audit: BPF prog-id=41 op=UNLOAD Jan 28 01:14:17.293000 audit: BPF prog-id=42 op=UNLOAD Jan 28 01:14:17.294000 audit: BPF prog-id=49 op=LOAD Jan 28 01:14:17.294000 audit: BPF prog-id=30 op=UNLOAD Jan 28 01:14:17.294000 audit: BPF prog-id=50 op=LOAD Jan 28 01:14:17.294000 audit: BPF prog-id=51 op=LOAD Jan 28 01:14:17.294000 audit: BPF prog-id=31 op=UNLOAD Jan 28 01:14:17.294000 audit: BPF prog-id=32 op=UNLOAD Jan 28 01:14:17.296000 audit: BPF prog-id=52 op=LOAD Jan 28 01:14:17.296000 audit: BPF prog-id=36 op=UNLOAD Jan 28 01:14:17.296000 audit: BPF prog-id=53 op=LOAD Jan 28 01:14:17.296000 audit: BPF prog-id=54 op=LOAD Jan 28 01:14:17.296000 audit: BPF prog-id=37 op=UNLOAD Jan 28 01:14:17.296000 audit: BPF prog-id=38 op=UNLOAD Jan 28 01:14:17.297000 audit: BPF prog-id=55 op=LOAD Jan 28 01:14:17.297000 audit: BPF prog-id=56 op=LOAD Jan 28 01:14:17.297000 audit: BPF prog-id=28 op=UNLOAD Jan 28 01:14:17.297000 audit: BPF prog-id=29 op=UNLOAD Jan 28 01:14:17.297000 audit: BPF prog-id=57 op=LOAD Jan 28 01:14:17.297000 audit: BPF prog-id=39 op=UNLOAD Jan 28 01:14:17.335554 systemd[1]: Found device 
dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 28 01:14:17.338405 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:14:17.340710 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 28 01:14:17.347326 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 28 01:14:17.349409 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 01:14:17.353266 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 28 01:14:17.357553 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 28 01:14:17.361265 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 28 01:14:17.363231 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 28 01:14:17.363414 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 01:14:17.366433 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 28 01:14:17.370465 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 28 01:14:17.371060 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 01:14:17.374291 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 28 01:14:17.376000 audit: BPF prog-id=58 op=LOAD Jan 28 01:14:17.380094 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Jan 28 01:14:17.385758 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 28 01:14:17.386375 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:14:17.390436 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:14:17.390576 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 01:14:17.390729 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 28 01:14:17.390854 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 01:14:17.390928 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 01:14:17.390996 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:14:17.397705 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:14:17.397897 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 01:14:17.401108 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 28 01:14:17.403700 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 28 01:14:17.404406 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Jan 28 01:14:17.404559 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 01:14:17.404641 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 01:14:17.404775 systemd[1]: Reached target time-set.target - System Time Set. Jan 28 01:14:17.405449 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:14:17.412808 systemd[1]: Finished ensure-sysext.service. Jan 28 01:14:17.413000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.433000 audit[1564]: SYSTEM_BOOT pid=1564 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.437325 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 28 01:14:17.438078 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 28 01:14:17.438000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.439000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.439375 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Jan 28 01:14:17.439881 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 28 01:14:17.440000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.440000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.442684 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 28 01:14:17.447189 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 28 01:14:17.447000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.457000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.457000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.456946 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 28 01:14:17.462000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:17.457201 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 28 01:14:17.457968 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 28 01:14:17.461794 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 28 01:14:17.501813 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 28 01:14:17.504000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.507680 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 28 01:14:17.507904 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 28 01:14:17.510401 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 28 01:14:17.512549 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 28 01:14:17.513000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.513000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.525957 kernel: PTP clock support registered Jan 28 01:14:17.533771 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 28 01:14:17.534223 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 28 01:14:17.538921 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 28 01:14:17.540000 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. 
Jan 28 01:14:17.541027 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 28 01:14:17.542000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.542000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:17.544965 systemd-networkd[1562]: lo: Link UP Jan 28 01:14:17.544971 systemd-networkd[1562]: lo: Gained carrier Jan 28 01:14:17.547000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 28 01:14:17.547000 audit[1602]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffcf5a75b40 a2=420 a3=0 items=0 ppid=1551 pid=1602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:17.547000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 01:14:17.547703 augenrules[1602]: No rules Jan 28 01:14:17.550616 systemd[1]: audit-rules.service: Deactivated successfully. Jan 28 01:14:17.551430 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 28 01:14:17.552936 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 28 01:14:17.555326 systemd[1]: Reached target network.target - Network. Jan 28 01:14:17.557130 systemd-networkd[1562]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:14:17.557134 systemd-networkd[1562]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Jan 28 01:14:17.557862 systemd-networkd[1562]: eth0: Link UP Jan 28 01:14:17.558013 systemd-networkd[1562]: eth0: Gained carrier Jan 28 01:14:17.558404 systemd-networkd[1562]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:14:17.559164 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 28 01:14:17.562653 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 28 01:14:17.575091 systemd-networkd[1562]: eth0: DHCPv4 address 10.0.0.143/25, gateway 10.0.0.129 acquired from 10.0.0.129 Jan 28 01:14:17.610160 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 28 01:14:17.627084 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 28 01:14:17.628087 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 28 01:14:17.654203 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:14:17.716407 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 01:14:17.718365 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:14:17.725560 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 28 01:14:17.731033 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 28 01:14:17.734399 kernel: Console: switching to colour dummy device 80x25 Jan 28 01:14:17.750731 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 28 01:14:17.751067 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 28 01:14:17.751085 kernel: [drm] features: -context_init Jan 28 01:14:17.751108 kernel: [drm] number of scanouts: 1 Jan 28 01:14:17.751120 kernel: [drm] number of cap sets: 0 Jan 28 01:14:17.751134 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 28 01:14:17.751150 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 28 01:14:17.751162 kernel: Console: switching to colour frame buffer device 160x50 Jan 28 01:14:17.751219 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 28 01:14:17.805342 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 01:14:17.805901 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:14:17.813754 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:14:17.933221 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:14:18.319519 ldconfig[1556]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 28 01:14:18.326595 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 28 01:14:18.328830 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 28 01:14:18.351445 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 28 01:14:18.353277 systemd[1]: Reached target sysinit.target - System Initialization. Jan 28 01:14:18.353764 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Jan 28 01:14:18.353852 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 28 01:14:18.353915 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 28 01:14:18.354151 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 28 01:14:18.354724 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 28 01:14:18.354810 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 28 01:14:18.354923 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 28 01:14:18.354971 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 28 01:14:18.355177 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 28 01:14:18.355203 systemd[1]: Reached target paths.target - Path Units. Jan 28 01:14:18.355423 systemd[1]: Reached target timers.target - Timer Units. Jan 28 01:14:18.357483 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 28 01:14:18.358980 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 28 01:14:18.363023 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 28 01:14:18.363640 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 28 01:14:18.363726 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 28 01:14:18.365717 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 28 01:14:18.367393 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 28 01:14:18.368671 systemd[1]: Listening on docker.socket - Docker Socket for the API. 
Jan 28 01:14:18.370786 systemd[1]: Reached target sockets.target - Socket Units. Jan 28 01:14:18.371829 systemd[1]: Reached target basic.target - Basic System. Jan 28 01:14:18.372633 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 28 01:14:18.372763 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 28 01:14:18.376069 systemd[1]: Starting chronyd.service - NTP client/server... Jan 28 01:14:18.380141 systemd[1]: Starting containerd.service - containerd container runtime... Jan 28 01:14:18.384768 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 28 01:14:18.388249 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 28 01:14:18.393031 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 28 01:14:18.400240 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 28 01:14:18.406560 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 28 01:14:18.407494 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 28 01:14:18.409597 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 28 01:14:18.416151 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 28 01:14:18.423231 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 28 01:14:18.427704 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 28 01:14:18.428030 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:18.437176 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Jan 28 01:14:18.444806 extend-filesystems[1647]: Found /dev/vda6 Jan 28 01:14:18.446243 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 28 01:14:18.449085 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 28 01:14:18.449654 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 28 01:14:18.451964 chronyd[1641]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 28 01:14:18.453172 systemd[1]: Starting update-engine.service - Update Engine... Jan 28 01:14:18.456575 jq[1646]: false Jan 28 01:14:18.458953 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 28 01:14:18.464880 extend-filesystems[1647]: Found /dev/vda9 Jan 28 01:14:18.463682 chronyd[1641]: Loaded seccomp filter (level 2) Jan 28 01:14:18.474850 google_oslogin_nss_cache[1648]: oslogin_cache_refresh[1648]: Refreshing passwd entry cache Jan 28 01:14:18.475042 extend-filesystems[1647]: Checking size of /dev/vda9 Jan 28 01:14:18.465056 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 28 01:14:18.473192 oslogin_cache_refresh[1648]: Refreshing passwd entry cache Jan 28 01:14:18.475647 systemd[1]: Started chronyd.service - NTP client/server. Jan 28 01:14:18.476287 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 28 01:14:18.476559 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 28 01:14:18.499725 google_oslogin_nss_cache[1648]: oslogin_cache_refresh[1648]: Failure getting users, quitting Jan 28 01:14:18.499725 google_oslogin_nss_cache[1648]: oslogin_cache_refresh[1648]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Jan 28 01:14:18.499707 oslogin_cache_refresh[1648]: Failure getting users, quitting Jan 28 01:14:18.499888 google_oslogin_nss_cache[1648]: oslogin_cache_refresh[1648]: Refreshing group entry cache Jan 28 01:14:18.499727 oslogin_cache_refresh[1648]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 28 01:14:18.499782 oslogin_cache_refresh[1648]: Refreshing group entry cache Jan 28 01:14:18.508519 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 28 01:14:18.509334 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 28 01:14:18.509730 google_oslogin_nss_cache[1648]: oslogin_cache_refresh[1648]: Failure getting groups, quitting Jan 28 01:14:18.509727 oslogin_cache_refresh[1648]: Failure getting groups, quitting Jan 28 01:14:18.509811 google_oslogin_nss_cache[1648]: oslogin_cache_refresh[1648]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 28 01:14:18.509741 oslogin_cache_refresh[1648]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 28 01:14:18.510862 systemd[1]: motdgen.service: Deactivated successfully. Jan 28 01:14:18.512157 jq[1661]: true Jan 28 01:14:18.513173 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 28 01:14:18.513996 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 28 01:14:18.514207 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. 
Jan 28 01:14:18.526406 extend-filesystems[1647]: Resized partition /dev/vda9 Jan 28 01:14:18.533451 extend-filesystems[1691]: resize2fs 1.47.3 (8-Jul-2025) Jan 28 01:14:18.543562 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 28 01:14:18.550078 update_engine[1660]: I20260128 01:14:18.547945 1660 main.cc:92] Flatcar Update Engine starting Jan 28 01:14:18.555671 tar[1666]: linux-amd64/LICENSE Jan 28 01:14:18.579069 dbus-daemon[1644]: [system] SELinux support is enabled Jan 28 01:14:18.597395 update_engine[1660]: I20260128 01:14:18.595336 1660 update_check_scheduler.cc:74] Next update check in 3m15s Jan 28 01:14:18.583194 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 28 01:14:18.597518 jq[1687]: true Jan 28 01:14:18.603572 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 28 01:14:18.608239 tar[1666]: linux-amd64/helm Jan 28 01:14:18.604243 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 28 01:14:18.607510 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 28 01:14:18.607532 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 28 01:14:18.609758 systemd[1]: Started update-engine.service - Update Engine. Jan 28 01:14:18.635770 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 28 01:14:18.684654 systemd-logind[1658]: New seat seat0. 
Jan 28 01:14:18.737071 systemd-logind[1658]: Watching system buttons on /dev/input/event3 (Power Button) Jan 28 01:14:18.737110 systemd-logind[1658]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 28 01:14:18.737513 systemd[1]: Started systemd-logind.service - User Login Management. Jan 28 01:14:18.790498 locksmithd[1715]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 28 01:14:18.818253 bash[1716]: Updated "/home/core/.ssh/authorized_keys" Jan 28 01:14:18.821388 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 28 01:14:18.825484 systemd[1]: Starting sshkeys.service... Jan 28 01:14:18.862861 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 28 01:14:18.867569 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 28 01:14:18.896422 containerd[1674]: time="2026-01-28T01:14:18Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 28 01:14:18.899202 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:18.899292 containerd[1674]: time="2026-01-28T01:14:18.897362427Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 28 01:14:18.920864 containerd[1674]: time="2026-01-28T01:14:18.920816566Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.223µs" Jan 28 01:14:18.921148 containerd[1674]: time="2026-01-28T01:14:18.921133158Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 28 01:14:18.921218 containerd[1674]: time="2026-01-28T01:14:18.921209095Z" level=info msg="loading plugin" 
id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 28 01:14:18.922092 containerd[1674]: time="2026-01-28T01:14:18.921676052Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 28 01:14:18.922092 containerd[1674]: time="2026-01-28T01:14:18.921820381Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 28 01:14:18.922092 containerd[1674]: time="2026-01-28T01:14:18.921834456Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 28 01:14:18.922092 containerd[1674]: time="2026-01-28T01:14:18.921882921Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 28 01:14:18.922092 containerd[1674]: time="2026-01-28T01:14:18.921892393Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 28 01:14:18.923282 containerd[1674]: time="2026-01-28T01:14:18.923265335Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 28 01:14:18.923325 containerd[1674]: time="2026-01-28T01:14:18.923316853Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 28 01:14:18.923370 containerd[1674]: time="2026-01-28T01:14:18.923360903Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 28 01:14:18.923401 containerd[1674]: time="2026-01-28T01:14:18.923394499Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 
28 01:14:18.923598 containerd[1674]: time="2026-01-28T01:14:18.923584532Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 28 01:14:18.924105 containerd[1674]: time="2026-01-28T01:14:18.924037445Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 28 01:14:18.924184 containerd[1674]: time="2026-01-28T01:14:18.924171678Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 28 01:14:18.925033 containerd[1674]: time="2026-01-28T01:14:18.925000817Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 28 01:14:18.925114 containerd[1674]: time="2026-01-28T01:14:18.925103992Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 28 01:14:18.925178 containerd[1674]: time="2026-01-28T01:14:18.925170738Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 28 01:14:18.925275 containerd[1674]: time="2026-01-28T01:14:18.925267290Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 28 01:14:18.929038 containerd[1674]: time="2026-01-28T01:14:18.927277054Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 28 01:14:18.929038 containerd[1674]: time="2026-01-28T01:14:18.927377561Z" level=info msg="metadata content store policy set" policy=shared Jan 28 01:14:18.974739 containerd[1674]: time="2026-01-28T01:14:18.974683502Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 28 01:14:18.974739 containerd[1674]: 
time="2026-01-28T01:14:18.974754200Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 28 01:14:18.974886 containerd[1674]: time="2026-01-28T01:14:18.974858496Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 28 01:14:18.974886 containerd[1674]: time="2026-01-28T01:14:18.974873061Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 28 01:14:18.974921 containerd[1674]: time="2026-01-28T01:14:18.974885394Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 28 01:14:18.974921 containerd[1674]: time="2026-01-28T01:14:18.974896669Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 28 01:14:18.974921 containerd[1674]: time="2026-01-28T01:14:18.974907289Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 28 01:14:18.974921 containerd[1674]: time="2026-01-28T01:14:18.974915305Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 28 01:14:18.975012 containerd[1674]: time="2026-01-28T01:14:18.974926788Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 28 01:14:18.975012 containerd[1674]: time="2026-01-28T01:14:18.974938021Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 28 01:14:18.975012 containerd[1674]: time="2026-01-28T01:14:18.974957870Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 28 01:14:18.975012 containerd[1674]: time="2026-01-28T01:14:18.974968950Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 28 01:14:18.975012 containerd[1674]: time="2026-01-28T01:14:18.974976711Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 28 01:14:18.975012 containerd[1674]: time="2026-01-28T01:14:18.974986993Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 28 01:14:18.975161 containerd[1674]: time="2026-01-28T01:14:18.975145982Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 28 01:14:18.975193 containerd[1674]: time="2026-01-28T01:14:18.975168587Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 28 01:14:18.975193 containerd[1674]: time="2026-01-28T01:14:18.975181462Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 28 01:14:18.975234 containerd[1674]: time="2026-01-28T01:14:18.975190827Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 28 01:14:18.975234 containerd[1674]: time="2026-01-28T01:14:18.975201530Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 28 01:14:18.975234 containerd[1674]: time="2026-01-28T01:14:18.975213892Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 28 01:14:18.975234 containerd[1674]: time="2026-01-28T01:14:18.975229710Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 28 01:14:18.975299 containerd[1674]: time="2026-01-28T01:14:18.975240722Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 28 01:14:18.975299 containerd[1674]: time="2026-01-28T01:14:18.975250093Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 28 
01:14:18.975299 containerd[1674]: time="2026-01-28T01:14:18.975259151Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 28 01:14:18.975299 containerd[1674]: time="2026-01-28T01:14:18.975267353Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 28 01:14:18.975299 containerd[1674]: time="2026-01-28T01:14:18.975289304Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 28 01:14:18.975380 containerd[1674]: time="2026-01-28T01:14:18.975333237Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 28 01:14:18.975380 containerd[1674]: time="2026-01-28T01:14:18.975344970Z" level=info msg="Start snapshots syncer" Jan 28 01:14:18.975380 containerd[1674]: time="2026-01-28T01:14:18.975363412Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 28 01:14:18.975671 containerd[1674]: time="2026-01-28T01:14:18.975632608Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975680390Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 28 01:14:18.976174 containerd[1674]: 
time="2026-01-28T01:14:18.975725977Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975813641Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975830533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975839523Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975848201Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975858529Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975867365Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975875936Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975884544Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975893538Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975923660Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 28 01:14:18.976174 containerd[1674]: 
time="2026-01-28T01:14:18.975934868Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 28 01:14:18.976174 containerd[1674]: time="2026-01-28T01:14:18.975942890Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 28 01:14:18.976722 containerd[1674]: time="2026-01-28T01:14:18.975956721Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 28 01:14:18.976722 containerd[1674]: time="2026-01-28T01:14:18.975964398Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 28 01:14:18.976722 containerd[1674]: time="2026-01-28T01:14:18.975981583Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 28 01:14:18.976722 containerd[1674]: time="2026-01-28T01:14:18.975992473Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 28 01:14:18.978207 containerd[1674]: time="2026-01-28T01:14:18.978187739Z" level=info msg="runtime interface created" Jan 28 01:14:18.978207 containerd[1674]: time="2026-01-28T01:14:18.978206434Z" level=info msg="created NRI interface" Jan 28 01:14:18.978365 containerd[1674]: time="2026-01-28T01:14:18.978219707Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 28 01:14:18.978365 containerd[1674]: time="2026-01-28T01:14:18.978238682Z" level=info msg="Connect containerd service" Jan 28 01:14:18.978365 containerd[1674]: time="2026-01-28T01:14:18.978265345Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 28 01:14:18.979170 containerd[1674]: time="2026-01-28T01:14:18.978940308Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 28 01:14:19.053816 sshd_keygen[1692]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 28 01:14:19.070044 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 28 01:14:19.090276 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 28 01:14:19.098558 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 28 01:14:19.117353 systemd[1]: issuegen.service: Deactivated successfully. Jan 28 01:14:19.117914 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 28 01:14:19.124353 extend-filesystems[1691]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 28 01:14:19.124353 extend-filesystems[1691]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 28 01:14:19.124353 extend-filesystems[1691]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 28 01:14:19.131738 extend-filesystems[1647]: Resized filesystem in /dev/vda9 Jan 28 01:14:19.124637 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.131840356Z" level=info msg="Start subscribing containerd event" Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.131879436Z" level=info msg="Start recovering state" Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.131961612Z" level=info msg="Start event monitor" Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.131971328Z" level=info msg="Start cni network conf syncer for default" Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.131979216Z" level=info msg="Start streaming server" Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.131986725Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.131993777Z" level=info msg="runtime interface starting up..." Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.131999133Z" level=info msg="starting plugins..." Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.133058319Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.132907427Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.133162419Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 28 01:14:19.137397 containerd[1674]: time="2026-01-28T01:14:19.133201646Z" level=info msg="containerd successfully booted in 0.237757s" Jan 28 01:14:19.129688 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 28 01:14:19.130179 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 28 01:14:19.137658 systemd[1]: Started containerd.service - containerd container runtime. Jan 28 01:14:19.154962 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Jan 28 01:14:19.165425 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 28 01:14:19.168549 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 28 01:14:19.169691 systemd[1]: Reached target getty.target - Login Prompts. Jan 28 01:14:19.281772 tar[1666]: linux-amd64/README.md Jan 28 01:14:19.298044 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 28 01:14:19.506240 systemd-networkd[1562]: eth0: Gained IPv6LL Jan 28 01:14:19.509114 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 28 01:14:19.511809 systemd[1]: Reached target network-online.target - Network is Online. Jan 28 01:14:19.516083 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:14:19.523024 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:19.522439 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 28 01:14:19.572319 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 28 01:14:19.916265 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:20.744955 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:14:20.752350 (kubelet)[1785]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:14:21.518176 kubelet[1785]: E0128 01:14:21.518101 1785 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:14:21.520582 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:14:21.520713 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 28 01:14:21.521798 systemd[1]: kubelet.service: Consumed 1.054s CPU time, 267.1M memory peak. Jan 28 01:14:21.544037 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:21.929067 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:25.553030 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:25.558406 coreos-metadata[1643]: Jan 28 01:14:25.558 WARN failed to locate config-drive, using the metadata service API instead Jan 28 01:14:25.579259 coreos-metadata[1643]: Jan 28 01:14:25.579 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 28 01:14:25.940052 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 01:14:25.946414 coreos-metadata[1728]: Jan 28 01:14:25.946 WARN failed to locate config-drive, using the metadata service API instead Jan 28 01:14:25.958692 coreos-metadata[1728]: Jan 28 01:14:25.958 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 28 01:14:26.952111 coreos-metadata[1728]: Jan 28 01:14:26.951 INFO Fetch successful Jan 28 01:14:26.952111 coreos-metadata[1728]: Jan 28 01:14:26.951 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 28 01:14:27.602905 coreos-metadata[1643]: Jan 28 01:14:27.602 INFO Fetch successful Jan 28 01:14:27.602905 coreos-metadata[1643]: Jan 28 01:14:27.602 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 28 01:14:29.320712 coreos-metadata[1728]: Jan 28 01:14:29.320 INFO Fetch successful Jan 28 01:14:29.323707 unknown[1728]: wrote ssh authorized keys file for user: core Jan 28 01:14:29.332439 coreos-metadata[1643]: Jan 28 01:14:29.332 INFO Fetch successful Jan 28 01:14:29.332439 coreos-metadata[1643]: Jan 28 01:14:29.332 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 28 01:14:29.352299 update-ssh-keys[1802]: Updated "/home/core/.ssh/authorized_keys" Jan 28 01:14:29.353752 
systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 28 01:14:29.355805 systemd[1]: Finished sshkeys.service. Jan 28 01:14:29.931309 coreos-metadata[1643]: Jan 28 01:14:29.931 INFO Fetch successful Jan 28 01:14:29.931309 coreos-metadata[1643]: Jan 28 01:14:29.931 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 28 01:14:30.529139 coreos-metadata[1643]: Jan 28 01:14:30.529 INFO Fetch successful Jan 28 01:14:30.529139 coreos-metadata[1643]: Jan 28 01:14:30.529 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 28 01:14:31.132917 coreos-metadata[1643]: Jan 28 01:14:31.132 INFO Fetch successful Jan 28 01:14:31.133191 coreos-metadata[1643]: Jan 28 01:14:31.132 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 28 01:14:31.608107 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 28 01:14:31.611584 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:14:31.735074 coreos-metadata[1643]: Jan 28 01:14:31.732 INFO Fetch successful Jan 28 01:14:31.765659 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 28 01:14:31.766090 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 28 01:14:31.796711 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:14:31.797619 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 28 01:14:31.797856 systemd[1]: Startup finished in 3.546s (kernel) + 13.373s (initrd) + 16.774s (userspace) = 33.693s. 
Jan 28 01:14:31.810329 (kubelet)[1819]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:14:31.848301 kubelet[1819]: E0128 01:14:31.848253 1819 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:14:31.851838 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:14:31.851970 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 01:14:31.852523 systemd[1]: kubelet.service: Consumed 167ms CPU time, 108.7M memory peak. Jan 28 01:14:42.102550 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 28 01:14:42.104453 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:14:42.244046 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:14:42.250067 chronyd[1641]: Selected source PHC0 Jan 28 01:14:42.252482 (kubelet)[1833]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:14:42.292565 kubelet[1833]: E0128 01:14:42.292518 1833 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:14:42.294726 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:14:42.294864 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 28 01:14:42.295545 systemd[1]: kubelet.service: Consumed 138ms CPU time, 107.7M memory peak. Jan 28 01:14:44.000021 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 28 01:14:44.001708 systemd[1]: Started sshd@0-10.0.0.143:22-20.161.92.111:36922.service - OpenSSH per-connection server daemon (20.161.92.111:36922). Jan 28 01:14:44.625924 sshd[1840]: Accepted publickey for core from 20.161.92.111 port 36922 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:14:44.628913 sshd-session[1840]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:14:44.639212 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 28 01:14:44.641264 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 28 01:14:44.643538 systemd-logind[1658]: New session 1 of user core. Jan 28 01:14:44.670842 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 28 01:14:44.673133 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 28 01:14:44.686356 (systemd)[1846]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:14:44.688979 systemd-logind[1658]: New session 2 of user core. Jan 28 01:14:44.815172 systemd[1846]: Queued start job for default target default.target. Jan 28 01:14:44.822054 systemd[1846]: Created slice app.slice - User Application Slice. Jan 28 01:14:44.822105 systemd[1846]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 28 01:14:44.822120 systemd[1846]: Reached target paths.target - Paths. Jan 28 01:14:44.822170 systemd[1846]: Reached target timers.target - Timers. Jan 28 01:14:44.823476 systemd[1846]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 28 01:14:44.826223 systemd[1846]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... 
Jan 28 01:14:44.836467 systemd[1846]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 28 01:14:44.836538 systemd[1846]: Reached target sockets.target - Sockets. Jan 28 01:14:44.837769 systemd[1846]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 28 01:14:44.837859 systemd[1846]: Reached target basic.target - Basic System. Jan 28 01:14:44.837903 systemd[1846]: Reached target default.target - Main User Target. Jan 28 01:14:44.837931 systemd[1846]: Startup finished in 143ms. Jan 28 01:14:44.838110 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 28 01:14:44.846400 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 28 01:14:45.168929 systemd[1]: Started sshd@1-10.0.0.143:22-20.161.92.111:36924.service - OpenSSH per-connection server daemon (20.161.92.111:36924). Jan 28 01:14:45.723217 sshd[1860]: Accepted publickey for core from 20.161.92.111 port 36924 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:14:45.724636 sshd-session[1860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:14:45.730148 systemd-logind[1658]: New session 3 of user core. Jan 28 01:14:45.736258 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 28 01:14:46.035994 sshd[1864]: Connection closed by 20.161.92.111 port 36924 Jan 28 01:14:46.036593 sshd-session[1860]: pam_unix(sshd:session): session closed for user core Jan 28 01:14:46.040806 systemd[1]: sshd@1-10.0.0.143:22-20.161.92.111:36924.service: Deactivated successfully. Jan 28 01:14:46.042596 systemd[1]: session-3.scope: Deactivated successfully. Jan 28 01:14:46.043359 systemd-logind[1658]: Session 3 logged out. Waiting for processes to exit. Jan 28 01:14:46.044649 systemd-logind[1658]: Removed session 3. Jan 28 01:14:46.148346 systemd[1]: Started sshd@2-10.0.0.143:22-20.161.92.111:36926.service - OpenSSH per-connection server daemon (20.161.92.111:36926). 
Jan 28 01:14:46.716546 sshd[1870]: Accepted publickey for core from 20.161.92.111 port 36926 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:14:46.718223 sshd-session[1870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:14:46.723997 systemd-logind[1658]: New session 4 of user core. Jan 28 01:14:46.729376 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 28 01:14:47.026844 sshd[1874]: Connection closed by 20.161.92.111 port 36926 Jan 28 01:14:47.027553 sshd-session[1870]: pam_unix(sshd:session): session closed for user core Jan 28 01:14:47.033004 systemd[1]: sshd@2-10.0.0.143:22-20.161.92.111:36926.service: Deactivated successfully. Jan 28 01:14:47.035494 systemd[1]: session-4.scope: Deactivated successfully. Jan 28 01:14:47.036499 systemd-logind[1658]: Session 4 logged out. Waiting for processes to exit. Jan 28 01:14:47.038015 systemd-logind[1658]: Removed session 4. Jan 28 01:14:47.143031 systemd[1]: Started sshd@3-10.0.0.143:22-20.161.92.111:36930.service - OpenSSH per-connection server daemon (20.161.92.111:36930). Jan 28 01:14:47.724233 sshd[1880]: Accepted publickey for core from 20.161.92.111 port 36930 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:14:47.725477 sshd-session[1880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:14:47.732315 systemd-logind[1658]: New session 5 of user core. Jan 28 01:14:47.743474 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 28 01:14:48.037410 sshd[1884]: Connection closed by 20.161.92.111 port 36930 Jan 28 01:14:48.038190 sshd-session[1880]: pam_unix(sshd:session): session closed for user core Jan 28 01:14:48.042775 systemd-logind[1658]: Session 5 logged out. Waiting for processes to exit. Jan 28 01:14:48.043322 systemd[1]: sshd@3-10.0.0.143:22-20.161.92.111:36930.service: Deactivated successfully. 
Jan 28 01:14:48.044911 systemd[1]: session-5.scope: Deactivated successfully. Jan 28 01:14:48.046506 systemd-logind[1658]: Removed session 5. Jan 28 01:14:48.150653 systemd[1]: Started sshd@4-10.0.0.143:22-20.161.92.111:36934.service - OpenSSH per-connection server daemon (20.161.92.111:36934). Jan 28 01:14:48.717143 sshd[1890]: Accepted publickey for core from 20.161.92.111 port 36934 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:14:48.718322 sshd-session[1890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:14:48.723627 systemd-logind[1658]: New session 6 of user core. Jan 28 01:14:48.730324 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 28 01:14:48.954385 sudo[1895]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 28 01:14:48.954864 sudo[1895]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 01:14:48.965342 sudo[1895]: pam_unix(sudo:session): session closed for user root Jan 28 01:14:49.069950 sshd[1894]: Connection closed by 20.161.92.111 port 36934 Jan 28 01:14:49.068406 sshd-session[1890]: pam_unix(sshd:session): session closed for user core Jan 28 01:14:49.073227 systemd[1]: sshd@4-10.0.0.143:22-20.161.92.111:36934.service: Deactivated successfully. Jan 28 01:14:49.075017 systemd[1]: session-6.scope: Deactivated successfully. Jan 28 01:14:49.076463 systemd-logind[1658]: Session 6 logged out. Waiting for processes to exit. Jan 28 01:14:49.077361 systemd-logind[1658]: Removed session 6. Jan 28 01:14:49.180932 systemd[1]: Started sshd@5-10.0.0.143:22-20.161.92.111:36950.service - OpenSSH per-connection server daemon (20.161.92.111:36950). 
Jan 28 01:14:49.742302 sshd[1902]: Accepted publickey for core from 20.161.92.111 port 36950 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:14:49.744507 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:14:49.751180 systemd-logind[1658]: New session 7 of user core. Jan 28 01:14:49.758420 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 28 01:14:49.955180 sudo[1908]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 28 01:14:49.955467 sudo[1908]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 01:14:49.958638 sudo[1908]: pam_unix(sudo:session): session closed for user root Jan 28 01:14:49.965067 sudo[1907]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 28 01:14:49.965348 sudo[1907]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 01:14:49.973640 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
Jan 28 01:14:50.024000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 28 01:14:50.025680 kernel: kauditd_printk_skb: 188 callbacks suppressed Jan 28 01:14:50.025743 kernel: audit: type=1305 audit(1769562890.024:234): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 28 01:14:50.024000 audit[1932]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffda0370cd0 a2=420 a3=0 items=0 ppid=1913 pid=1932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:50.027472 augenrules[1932]: No rules Jan 28 01:14:50.030546 kernel: audit: type=1300 audit(1769562890.024:234): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffda0370cd0 a2=420 a3=0 items=0 ppid=1913 pid=1932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:50.024000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 01:14:50.032241 kernel: audit: type=1327 audit(1769562890.024:234): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 01:14:50.031943 systemd[1]: audit-rules.service: Deactivated successfully. Jan 28 01:14:50.032780 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 28 01:14:50.032000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:50.034436 sudo[1907]: pam_unix(sudo:session): session closed for user root Jan 28 01:14:50.036031 kernel: audit: type=1130 audit(1769562890.032:235): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:50.032000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:50.032000 audit[1907]: USER_END pid=1907 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:14:50.040582 kernel: audit: type=1131 audit(1769562890.032:236): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:50.040636 kernel: audit: type=1106 audit(1769562890.032:237): pid=1907 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:14:50.040661 kernel: audit: type=1104 audit(1769562890.032:238): pid=1907 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:14:50.032000 audit[1907]: CRED_DISP pid=1907 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:50.140119 sshd[1906]: Connection closed by 20.161.92.111 port 36950 Jan 28 01:14:50.140313 sshd-session[1902]: pam_unix(sshd:session): session closed for user core Jan 28 01:14:50.141000 audit[1902]: USER_END pid=1902 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:14:50.141000 audit[1902]: CRED_DISP pid=1902 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:14:50.145662 systemd[1]: sshd@5-10.0.0.143:22-20.161.92.111:36950.service: Deactivated successfully. Jan 28 01:14:50.147953 kernel: audit: type=1106 audit(1769562890.141:239): pid=1902 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:14:50.148015 kernel: audit: type=1104 audit(1769562890.141:240): pid=1902 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:14:50.144000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.0.143:22-20.161.92.111:36950 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:50.148512 systemd[1]: session-7.scope: Deactivated successfully. Jan 28 01:14:50.151084 kernel: audit: type=1131 audit(1769562890.144:241): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.0.143:22-20.161.92.111:36950 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:50.151171 systemd-logind[1658]: Session 7 logged out. Waiting for processes to exit. Jan 28 01:14:50.152888 systemd-logind[1658]: Removed session 7. Jan 28 01:14:50.252000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.0.143:22-20.161.92.111:36952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:50.253943 systemd[1]: Started sshd@6-10.0.0.143:22-20.161.92.111:36952.service - OpenSSH per-connection server daemon (20.161.92.111:36952). Jan 28 01:14:50.823000 audit[1941]: USER_ACCT pid=1941 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:14:50.824443 sshd[1941]: Accepted publickey for core from 20.161.92.111 port 36952 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:14:50.824000 audit[1941]: CRED_ACQ pid=1941 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:14:50.824000 audit[1941]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc1bd18030 a2=3 a3=0 items=0 ppid=1 pid=1941 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:50.824000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:14:50.826304 sshd-session[1941]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:14:50.832112 systemd-logind[1658]: New session 8 of user core. Jan 28 01:14:50.842395 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 28 01:14:50.844000 audit[1941]: USER_START pid=1941 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:14:50.846000 audit[1945]: CRED_ACQ pid=1945 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:14:51.035000 audit[1946]: USER_ACCT pid=1946 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:14:51.035000 audit[1946]: CRED_REFR pid=1946 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:14:51.035000 audit[1946]: USER_START pid=1946 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:51.035433 sudo[1946]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 28 01:14:51.035762 sudo[1946]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 01:14:51.519081 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 28 01:14:51.536386 (dockerd)[1967]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 28 01:14:51.847689 dockerd[1967]: time="2026-01-28T01:14:51.847607343Z" level=info msg="Starting up" Jan 28 01:14:51.849540 dockerd[1967]: time="2026-01-28T01:14:51.849507604Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 28 01:14:51.864113 dockerd[1967]: time="2026-01-28T01:14:51.864070198Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 28 01:14:51.923487 dockerd[1967]: time="2026-01-28T01:14:51.923302054Z" level=info msg="Loading containers: start." 
Jan 28 01:14:51.935038 kernel: Initializing XFRM netlink socket Jan 28 01:14:51.993000 audit[2015]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2015 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:51.993000 audit[2015]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffd61bb3c00 a2=0 a3=0 items=0 ppid=1967 pid=2015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:51.993000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 28 01:14:51.995000 audit[2017]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2017 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:51.995000 audit[2017]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fff7936b120 a2=0 a3=0 items=0 ppid=1967 pid=2017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:51.995000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 28 01:14:51.997000 audit[2019]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2019 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:51.997000 audit[2019]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe2e747c30 a2=0 a3=0 items=0 ppid=1967 pid=2019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:51.997000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 28 01:14:51.998000 audit[2021]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2021 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:51.998000 audit[2021]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdad1099c0 a2=0 a3=0 items=0 ppid=1967 pid=2021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:51.998000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 28 01:14:52.000000 audit[2023]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2023 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.000000 audit[2023]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcfc6928e0 a2=0 a3=0 items=0 ppid=1967 pid=2023 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.000000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 28 01:14:52.002000 audit[2025]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2025 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.002000 audit[2025]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd38b4f9d0 a2=0 a3=0 items=0 ppid=1967 pid=2025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.002000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 01:14:52.003000 audit[2027]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.003000 audit[2027]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fffc382aac0 a2=0 a3=0 items=0 ppid=1967 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.003000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 01:14:52.005000 audit[2029]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.005000 audit[2029]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffeb53e1c80 a2=0 a3=0 items=0 ppid=1967 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.005000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 28 01:14:52.049000 audit[2032]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.049000 audit[2032]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7fff8352b5c0 a2=0 a3=0 items=0 ppid=1967 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.049000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 28 01:14:52.051000 audit[2034]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.051000 audit[2034]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe6ddac2f0 a2=0 a3=0 items=0 ppid=1967 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.051000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 28 01:14:52.054000 audit[2036]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2036 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.054000 audit[2036]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffccfd68ce0 a2=0 a3=0 items=0 ppid=1967 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.054000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 28 01:14:52.056000 audit[2038]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2038 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.056000 audit[2038]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffec79fde50 a2=0 a3=0 items=0 ppid=1967 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.056000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 01:14:52.058000 audit[2040]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2040 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.058000 audit[2040]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffedb5901a0 a2=0 a3=0 items=0 ppid=1967 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.058000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 28 01:14:52.098000 audit[2070]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2070 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.098000 audit[2070]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffcdd9a39b0 a2=0 a3=0 items=0 ppid=1967 pid=2070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.098000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 28 01:14:52.100000 audit[2072]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.100000 audit[2072]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffcc29039e0 a2=0 a3=0 items=0 ppid=1967 pid=2072 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.100000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 28 01:14:52.103000 audit[2074]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2074 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.103000 audit[2074]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe0b819330 a2=0 a3=0 items=0 ppid=1967 pid=2074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.103000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 28 01:14:52.105000 audit[2076]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2076 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.105000 audit[2076]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff4711a2f0 a2=0 a3=0 items=0 ppid=1967 pid=2076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.105000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 28 01:14:52.107000 audit[2078]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2078 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.107000 audit[2078]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc56a1d9b0 a2=0 a3=0 items=0 ppid=1967 pid=2078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.107000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 28 01:14:52.108000 audit[2080]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2080 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.108000 audit[2080]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe12724090 a2=0 a3=0 items=0 ppid=1967 pid=2080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.108000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 01:14:52.110000 audit[2082]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2082 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.110000 audit[2082]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffddfb16030 a2=0 a3=0 items=0 ppid=1967 pid=2082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.110000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 01:14:52.113000 audit[2084]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2084 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.113000 audit[2084]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffd747c4d30 a2=0 a3=0 items=0 ppid=1967 pid=2084 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.113000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 28 01:14:52.115000 audit[2086]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.115000 audit[2086]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7fff50f74bd0 a2=0 a3=0 items=0 ppid=1967 pid=2086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.115000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 28 01:14:52.117000 audit[2088]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.117000 audit[2088]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff67f8ed40 a2=0 a3=0 items=0 ppid=1967 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.117000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 28 01:14:52.119000 audit[2090]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2090 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Jan 28 01:14:52.119000 audit[2090]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7fff8171b140 a2=0 a3=0 items=0 ppid=1967 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.119000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 28 01:14:52.121000 audit[2092]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.121000 audit[2092]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffd8a1e9fe0 a2=0 a3=0 items=0 ppid=1967 pid=2092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.121000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 01:14:52.123000 audit[2094]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.123000 audit[2094]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffe61f55880 a2=0 a3=0 items=0 ppid=1967 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.123000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 28 01:14:52.128000 audit[2099]: NETFILTER_CFG table=filter:28 family=2 entries=1 
op=nft_register_chain pid=2099 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.128000 audit[2099]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcea247580 a2=0 a3=0 items=0 ppid=1967 pid=2099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.128000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 28 01:14:52.130000 audit[2101]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2101 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.130000 audit[2101]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffc9607dec0 a2=0 a3=0 items=0 ppid=1967 pid=2101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.130000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 28 01:14:52.133000 audit[2103]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2103 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.133000 audit[2103]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fffcd13f9f0 a2=0 a3=0 items=0 ppid=1967 pid=2103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.133000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 28 01:14:52.135000 audit[2105]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain 
pid=2105 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.135000 audit[2105]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc73fe57d0 a2=0 a3=0 items=0 ppid=1967 pid=2105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.135000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 28 01:14:52.137000 audit[2107]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2107 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.137000 audit[2107]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffe4bb41700 a2=0 a3=0 items=0 ppid=1967 pid=2107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.137000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 28 01:14:52.139000 audit[2109]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:14:52.139000 audit[2109]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffe196fbb40 a2=0 a3=0 items=0 ppid=1967 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.139000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 28 01:14:52.178000 audit[2114]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2114 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.178000 audit[2114]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffdb7467a40 a2=0 a3=0 items=0 ppid=1967 pid=2114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.178000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 28 01:14:52.180000 audit[2116]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.180000 audit[2116]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffcae7e3350 a2=0 a3=0 items=0 ppid=1967 pid=2116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.180000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 28 01:14:52.187000 audit[2124]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2124 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.187000 audit[2124]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffcfc9a06e0 a2=0 a3=0 items=0 ppid=1967 pid=2124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.187000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 28 01:14:52.199000 audit[2130]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2130 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.199000 audit[2130]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fffd6d811e0 a2=0 a3=0 items=0 ppid=1967 pid=2130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.199000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 28 01:14:52.201000 audit[2132]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.201000 audit[2132]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffdc64e2a60 a2=0 a3=0 items=0 ppid=1967 pid=2132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.201000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 28 01:14:52.203000 audit[2134]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.203000 audit[2134]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc1b3861d0 a2=0 a3=0 items=0 ppid=1967 pid=2134 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.203000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 28 01:14:52.205000 audit[2136]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2136 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.205000 audit[2136]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffcdd7cd810 a2=0 a3=0 items=0 ppid=1967 pid=2136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.205000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 01:14:52.207000 audit[2138]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:14:52.207000 audit[2138]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffea81ccee0 a2=0 a3=0 items=0 ppid=1967 pid=2138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:14:52.207000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 28 01:14:52.209478 systemd-networkd[1562]: docker0: Link UP Jan 28 01:14:52.216793 dockerd[1967]: time="2026-01-28T01:14:52.216749236Z" 
level=info msg="Loading containers: done." Jan 28 01:14:52.237143 dockerd[1967]: time="2026-01-28T01:14:52.236834140Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 28 01:14:52.237143 dockerd[1967]: time="2026-01-28T01:14:52.236918866Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 28 01:14:52.237143 dockerd[1967]: time="2026-01-28T01:14:52.236991277Z" level=info msg="Initializing buildkit" Jan 28 01:14:52.275023 dockerd[1967]: time="2026-01-28T01:14:52.274970204Z" level=info msg="Completed buildkit initialization" Jan 28 01:14:52.281677 dockerd[1967]: time="2026-01-28T01:14:52.281610413Z" level=info msg="Daemon has completed initialization" Jan 28 01:14:52.282621 dockerd[1967]: time="2026-01-28T01:14:52.281850359Z" level=info msg="API listen on /run/docker.sock" Jan 28 01:14:52.282132 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 28 01:14:52.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:14:52.333781 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 28 01:14:52.336197 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:14:52.467209 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:14:52.466000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:14:52.474270 (kubelet)[2181]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:14:52.510142 kubelet[2181]: E0128 01:14:52.510090 2181 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:14:52.512731 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:14:52.512855 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 01:14:52.513457 systemd[1]: kubelet.service: Consumed 133ms CPU time, 108.4M memory peak. Jan 28 01:14:52.512000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:14:53.703904 containerd[1674]: time="2026-01-28T01:14:53.703821311Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Jan 28 01:14:54.383516 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1420690552.mount: Deactivated successfully. 
Jan 28 01:14:55.250973 containerd[1674]: time="2026-01-28T01:14:55.250880990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:55.253628 containerd[1674]: time="2026-01-28T01:14:55.253591675Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=28445968" Jan 28 01:14:55.255256 containerd[1674]: time="2026-01-28T01:14:55.255197545Z" level=info msg="ImageCreate event name:\"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:55.259242 containerd[1674]: time="2026-01-28T01:14:55.259206262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:55.259998 containerd[1674]: time="2026-01-28T01:14:55.259966002Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"30111311\" in 1.555979437s" Jan 28 01:14:55.259998 containerd[1674]: time="2026-01-28T01:14:55.259996776Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\"" Jan 28 01:14:55.260815 containerd[1674]: time="2026-01-28T01:14:55.260787477Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Jan 28 01:14:56.623858 containerd[1674]: time="2026-01-28T01:14:56.623145509Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:56.625549 containerd[1674]: time="2026-01-28T01:14:56.625526891Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=26008626" Jan 28 01:14:56.627204 containerd[1674]: time="2026-01-28T01:14:56.627186643Z" level=info msg="ImageCreate event name:\"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:56.630035 containerd[1674]: time="2026-01-28T01:14:56.629998013Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:56.630747 containerd[1674]: time="2026-01-28T01:14:56.630727001Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"27673815\" in 1.369915491s" Jan 28 01:14:56.630813 containerd[1674]: time="2026-01-28T01:14:56.630802582Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\"" Jan 28 01:14:56.631272 containerd[1674]: time="2026-01-28T01:14:56.631250110Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Jan 28 01:14:57.697951 containerd[1674]: time="2026-01-28T01:14:57.697203315Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:57.698512 containerd[1674]: time="2026-01-28T01:14:57.698478100Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=0" Jan 28 01:14:57.700110 containerd[1674]: time="2026-01-28T01:14:57.700078637Z" level=info msg="ImageCreate event name:\"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:57.703386 containerd[1674]: time="2026-01-28T01:14:57.703355492Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:57.704262 containerd[1674]: time="2026-01-28T01:14:57.704239638Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"21815154\" in 1.072964049s" Jan 28 01:14:57.704345 containerd[1674]: time="2026-01-28T01:14:57.704333296Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\"" Jan 28 01:14:57.705076 containerd[1674]: time="2026-01-28T01:14:57.705056676Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Jan 28 01:14:58.845573 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1820484875.mount: Deactivated successfully. 
Jan 28 01:14:59.227538 containerd[1674]: time="2026-01-28T01:14:59.227417283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:59.229297 containerd[1674]: time="2026-01-28T01:14:59.229167623Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=31926374" Jan 28 01:14:59.230707 containerd[1674]: time="2026-01-28T01:14:59.230685589Z" level=info msg="ImageCreate event name:\"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:59.234934 containerd[1674]: time="2026-01-28T01:14:59.234902376Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:14:59.236159 containerd[1674]: time="2026-01-28T01:14:59.236058871Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"31929115\" in 1.530978508s" Jan 28 01:14:59.236159 containerd[1674]: time="2026-01-28T01:14:59.236084657Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\"" Jan 28 01:14:59.236500 containerd[1674]: time="2026-01-28T01:14:59.236471827Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 28 01:14:59.990348 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2897991413.mount: Deactivated successfully. 
Jan 28 01:15:00.624481 containerd[1674]: time="2026-01-28T01:15:00.624400784Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:00.625843 containerd[1674]: time="2026-01-28T01:15:00.625612382Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=0" Jan 28 01:15:00.627220 containerd[1674]: time="2026-01-28T01:15:00.627185369Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:00.630596 containerd[1674]: time="2026-01-28T01:15:00.630567224Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:00.631395 containerd[1674]: time="2026-01-28T01:15:00.631375280Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.39471706s" Jan 28 01:15:00.631476 containerd[1674]: time="2026-01-28T01:15:00.631465573Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Jan 28 01:15:00.632140 containerd[1674]: time="2026-01-28T01:15:00.632075885Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 28 01:15:01.186828 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount135969690.mount: Deactivated successfully. 
Jan 28 01:15:01.199099 containerd[1674]: time="2026-01-28T01:15:01.199046648Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 01:15:01.200808 containerd[1674]: time="2026-01-28T01:15:01.200657596Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 28 01:15:01.203082 containerd[1674]: time="2026-01-28T01:15:01.203055514Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 01:15:01.208687 containerd[1674]: time="2026-01-28T01:15:01.208659615Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 01:15:01.208951 containerd[1674]: time="2026-01-28T01:15:01.208789197Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 576.689749ms" Jan 28 01:15:01.208951 containerd[1674]: time="2026-01-28T01:15:01.208818983Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 28 01:15:01.210148 containerd[1674]: time="2026-01-28T01:15:01.210130620Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jan 28 01:15:01.905072 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4043208743.mount: Deactivated 
successfully. Jan 28 01:15:02.583666 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 28 01:15:02.586530 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:15:02.772916 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:15:02.775625 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 28 01:15:02.775712 kernel: audit: type=1130 audit(1769562902.772:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:02.772000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:02.786365 (kubelet)[2383]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:15:02.837159 kubelet[2383]: E0128 01:15:02.834965 2383 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:15:02.838941 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:15:02.839768 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 01:15:02.839000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:15:02.840755 systemd[1]: kubelet.service: Consumed 157ms CPU time, 108.4M memory peak. 
Jan 28 01:15:02.844016 kernel: audit: type=1131 audit(1769562902.839:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:15:03.265808 containerd[1674]: time="2026-01-28T01:15:03.265374188Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:03.267029 containerd[1674]: time="2026-01-28T01:15:03.266990069Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=46127678" Jan 28 01:15:03.269034 containerd[1674]: time="2026-01-28T01:15:03.268843322Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:03.276018 containerd[1674]: time="2026-01-28T01:15:03.274527881Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:03.276338 containerd[1674]: time="2026-01-28T01:15:03.276310313Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.066089243s" Jan 28 01:15:03.276392 containerd[1674]: time="2026-01-28T01:15:03.276343184Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Jan 28 01:15:04.118031 update_engine[1660]: I20260128 01:15:04.116047 1660 update_attempter.cc:509] Updating boot flags... 
Jan 28 01:15:06.607754 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:15:06.607909 systemd[1]: kubelet.service: Consumed 157ms CPU time, 108.4M memory peak. Jan 28 01:15:06.606000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:06.612033 kernel: audit: type=1130 audit(1769562906.606:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:06.606000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:06.615498 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:15:06.616023 kernel: audit: type=1131 audit(1769562906.606:297): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:06.643143 systemd[1]: Reload requested from client PID 2437 ('systemctl') (unit session-8.scope)... Jan 28 01:15:06.643158 systemd[1]: Reloading... Jan 28 01:15:06.738281 zram_generator::config[2479]: No configuration found. Jan 28 01:15:06.947760 systemd[1]: Reloading finished in 304 ms. 
Jan 28 01:15:06.977000 audit: BPF prog-id=63 op=LOAD Jan 28 01:15:06.981039 kernel: audit: type=1334 audit(1769562906.977:298): prog-id=63 op=LOAD Jan 28 01:15:06.979000 audit: BPF prog-id=58 op=UNLOAD Jan 28 01:15:06.983477 kernel: audit: type=1334 audit(1769562906.979:299): prog-id=58 op=UNLOAD Jan 28 01:15:06.983527 kernel: audit: type=1334 audit(1769562906.980:300): prog-id=64 op=LOAD Jan 28 01:15:06.980000 audit: BPF prog-id=64 op=LOAD Jan 28 01:15:06.984617 kernel: audit: type=1334 audit(1769562906.980:301): prog-id=43 op=UNLOAD Jan 28 01:15:06.980000 audit: BPF prog-id=43 op=UNLOAD Jan 28 01:15:06.985749 kernel: audit: type=1334 audit(1769562906.980:302): prog-id=65 op=LOAD Jan 28 01:15:06.980000 audit: BPF prog-id=65 op=LOAD Jan 28 01:15:06.986872 kernel: audit: type=1334 audit(1769562906.980:303): prog-id=66 op=LOAD Jan 28 01:15:06.980000 audit: BPF prog-id=66 op=LOAD Jan 28 01:15:06.980000 audit: BPF prog-id=44 op=UNLOAD Jan 28 01:15:06.980000 audit: BPF prog-id=45 op=UNLOAD Jan 28 01:15:06.983000 audit: BPF prog-id=67 op=LOAD Jan 28 01:15:06.983000 audit: BPF prog-id=59 op=UNLOAD Jan 28 01:15:06.983000 audit: BPF prog-id=68 op=LOAD Jan 28 01:15:06.983000 audit: BPF prog-id=57 op=UNLOAD Jan 28 01:15:06.984000 audit: BPF prog-id=69 op=LOAD Jan 28 01:15:06.984000 audit: BPF prog-id=52 op=UNLOAD Jan 28 01:15:06.984000 audit: BPF prog-id=70 op=LOAD Jan 28 01:15:06.984000 audit: BPF prog-id=71 op=LOAD Jan 28 01:15:06.984000 audit: BPF prog-id=53 op=UNLOAD Jan 28 01:15:06.984000 audit: BPF prog-id=54 op=UNLOAD Jan 28 01:15:06.984000 audit: BPF prog-id=72 op=LOAD Jan 28 01:15:06.984000 audit: BPF prog-id=49 op=UNLOAD Jan 28 01:15:06.984000 audit: BPF prog-id=73 op=LOAD Jan 28 01:15:06.985000 audit: BPF prog-id=74 op=LOAD Jan 28 01:15:06.985000 audit: BPF prog-id=50 op=UNLOAD Jan 28 01:15:06.985000 audit: BPF prog-id=51 op=UNLOAD Jan 28 01:15:06.986000 audit: BPF prog-id=75 op=LOAD Jan 28 01:15:06.986000 audit: BPF prog-id=46 op=UNLOAD Jan 28 01:15:06.986000 
audit: BPF prog-id=76 op=LOAD Jan 28 01:15:06.986000 audit: BPF prog-id=77 op=LOAD Jan 28 01:15:06.986000 audit: BPF prog-id=47 op=UNLOAD Jan 28 01:15:06.986000 audit: BPF prog-id=48 op=UNLOAD Jan 28 01:15:06.987000 audit: BPF prog-id=78 op=LOAD Jan 28 01:15:06.987000 audit: BPF prog-id=60 op=UNLOAD Jan 28 01:15:06.987000 audit: BPF prog-id=79 op=LOAD Jan 28 01:15:06.987000 audit: BPF prog-id=80 op=LOAD Jan 28 01:15:06.987000 audit: BPF prog-id=61 op=UNLOAD Jan 28 01:15:06.987000 audit: BPF prog-id=62 op=UNLOAD Jan 28 01:15:07.004000 audit: BPF prog-id=81 op=LOAD Jan 28 01:15:07.005000 audit: BPF prog-id=82 op=LOAD Jan 28 01:15:07.005000 audit: BPF prog-id=55 op=UNLOAD Jan 28 01:15:07.005000 audit: BPF prog-id=56 op=UNLOAD Jan 28 01:15:07.019617 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 28 01:15:07.019702 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 28 01:15:07.020035 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:15:07.019000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:15:07.020095 systemd[1]: kubelet.service: Consumed 103ms CPU time, 98.7M memory peak. Jan 28 01:15:07.021650 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:15:07.164120 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:15:07.163000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:15:07.172238 (kubelet)[2537]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 28 01:15:07.207266 kubelet[2537]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 01:15:07.208033 kubelet[2537]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 28 01:15:07.208087 kubelet[2537]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 01:15:07.208324 kubelet[2537]: I0128 01:15:07.208176 2537 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 01:15:07.855582 kubelet[2537]: I0128 01:15:07.855542 2537 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 28 01:15:07.856022 kubelet[2537]: I0128 01:15:07.855750 2537 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 28 01:15:07.856022 kubelet[2537]: I0128 01:15:07.855995 2537 server.go:956] "Client rotation is on, will bootstrap in background" Jan 28 01:15:07.898128 kubelet[2537]: I0128 01:15:07.898094 2537 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 28 01:15:07.899409 kubelet[2537]: E0128 01:15:07.899380 2537 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.143:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 
10.0.0.143:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 28 01:15:07.906615 kubelet[2537]: I0128 01:15:07.906599 2537 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 01:15:07.910124 kubelet[2537]: I0128 01:15:07.910108 2537 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 28 01:15:07.910427 kubelet[2537]: I0128 01:15:07.910400 2537 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 01:15:07.910642 kubelet[2537]: I0128 01:15:07.910483 2537 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-n-62761e1650","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerSco
pe":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 01:15:07.910779 kubelet[2537]: I0128 01:15:07.910771 2537 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 01:15:07.910814 kubelet[2537]: I0128 01:15:07.910810 2537 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 01:15:07.912405 kubelet[2537]: I0128 01:15:07.912392 2537 state_mem.go:36] "Initialized new in-memory state store" Jan 28 01:15:07.916792 kubelet[2537]: I0128 01:15:07.916774 2537 kubelet.go:480] "Attempting to sync node with API server" Jan 28 01:15:07.917106 kubelet[2537]: I0128 01:15:07.916916 2537 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 01:15:07.917106 kubelet[2537]: I0128 01:15:07.916949 2537 kubelet.go:386] "Adding apiserver pod source" Jan 28 01:15:07.917106 kubelet[2537]: I0128 01:15:07.916964 2537 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 28 01:15:07.925870 kubelet[2537]: E0128 01:15:07.925542 2537 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.143:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4593-0-0-n-62761e1650&limit=500&resourceVersion=0\": dial tcp 10.0.0.143:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 28 01:15:07.925938 kubelet[2537]: E0128 01:15:07.925927 2537 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.143:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.143:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 28 
01:15:07.926316 kubelet[2537]: I0128 01:15:07.926301 2537 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 28 01:15:07.926756 kubelet[2537]: I0128 01:15:07.926738 2537 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 28 01:15:07.929187 kubelet[2537]: W0128 01:15:07.929160 2537 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 28 01:15:07.932916 kubelet[2537]: I0128 01:15:07.932768 2537 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 28 01:15:07.932916 kubelet[2537]: I0128 01:15:07.932814 2537 server.go:1289] "Started kubelet" Jan 28 01:15:07.935493 kubelet[2537]: I0128 01:15:07.935455 2537 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 28 01:15:07.937124 kubelet[2537]: I0128 01:15:07.937104 2537 server.go:317] "Adding debug handlers to kubelet server" Jan 28 01:15:07.941607 kubelet[2537]: I0128 01:15:07.940683 2537 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 28 01:15:07.941607 kubelet[2537]: I0128 01:15:07.940940 2537 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 28 01:15:07.941905 kubelet[2537]: I0128 01:15:07.941890 2537 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 28 01:15:07.943593 kubelet[2537]: E0128 01:15:07.942026 2537 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.143:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.143:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4593-0-0-n-62761e1650.188ec016d70a8296 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4593-0-0-n-62761e1650,UID:ci-4593-0-0-n-62761e1650,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4593-0-0-n-62761e1650,},FirstTimestamp:2026-01-28 01:15:07.93278735 +0000 UTC m=+0.756590688,LastTimestamp:2026-01-28 01:15:07.93278735 +0000 UTC m=+0.756590688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-n-62761e1650,}" Jan 28 01:15:07.944127 kubelet[2537]: I0128 01:15:07.944114 2537 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 28 01:15:07.950132 kernel: kauditd_printk_skb: 36 callbacks suppressed Jan 28 01:15:07.950201 kernel: audit: type=1325 audit(1769562907.945:340): table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2552 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.945000 audit[2552]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2552 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.950334 kubelet[2537]: I0128 01:15:07.948867 2537 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 28 01:15:07.950334 kubelet[2537]: E0128 01:15:07.949077 2537 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-n-62761e1650\" not found" Jan 28 01:15:07.950929 kubelet[2537]: E0128 01:15:07.950910 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.143:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-62761e1650?timeout=10s\": dial tcp 10.0.0.143:6443: connect: connection refused" interval="200ms" Jan 28 01:15:07.951219 kubelet[2537]: I0128 01:15:07.951207 2537 factory.go:223] Registration of the systemd container factory 
successfully Jan 28 01:15:07.951341 kubelet[2537]: I0128 01:15:07.951329 2537 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 28 01:15:07.945000 audit[2552]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffcb0e0d020 a2=0 a3=0 items=0 ppid=2537 pid=2552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.954228 kubelet[2537]: I0128 01:15:07.951831 2537 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 28 01:15:07.954228 kubelet[2537]: I0128 01:15:07.951892 2537 reconciler.go:26] "Reconciler: start to sync state" Jan 28 01:15:07.945000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 28 01:15:07.956350 kubelet[2537]: E0128 01:15:07.956337 2537 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 28 01:15:07.957111 kernel: audit: type=1300 audit(1769562907.945:340): arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffcb0e0d020 a2=0 a3=0 items=0 ppid=2537 pid=2552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.957166 kernel: audit: type=1327 audit(1769562907.945:340): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 28 01:15:07.957250 kubelet[2537]: E0128 01:15:07.957205 2537 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.143:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.143:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 28 01:15:07.958808 kubelet[2537]: I0128 01:15:07.958614 2537 factory.go:223] Registration of the containerd container factory successfully Jan 28 01:15:07.946000 audit[2553]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2553 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.962024 kernel: audit: type=1325 audit(1769562907.946:341): table=filter:43 family=2 entries=1 op=nft_register_chain pid=2553 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.946000 audit[2553]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffded803400 a2=0 a3=0 items=0 ppid=2537 pid=2553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.966496 kubelet[2537]: I0128 01:15:07.966475 2537 kubelet_network_linux.go:49] "Initialized 
iptables rules." protocol="IPv4" Jan 28 01:15:07.967871 kubelet[2537]: I0128 01:15:07.967514 2537 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 28 01:15:07.967871 kubelet[2537]: I0128 01:15:07.967532 2537 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 28 01:15:07.967871 kubelet[2537]: I0128 01:15:07.967554 2537 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 28 01:15:07.967871 kubelet[2537]: I0128 01:15:07.967562 2537 kubelet.go:2436] "Starting kubelet main sync loop" Jan 28 01:15:07.967871 kubelet[2537]: E0128 01:15:07.967598 2537 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 01:15:07.946000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 28 01:15:07.972113 kernel: audit: type=1300 audit(1769562907.946:341): arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffded803400 a2=0 a3=0 items=0 ppid=2537 pid=2553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.972165 kernel: audit: type=1327 audit(1769562907.946:341): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 28 01:15:07.972184 kernel: audit: type=1325 audit(1769562907.949:342): table=filter:44 family=2 entries=2 op=nft_register_chain pid=2555 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.949000 audit[2555]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2555 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.949000 audit[2555]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=340 a0=3 a1=7fff761831a0 a2=0 a3=0 items=0 ppid=2537 pid=2555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.976331 kernel: audit: type=1300 audit(1769562907.949:342): arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff761831a0 a2=0 a3=0 items=0 ppid=2537 pid=2555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.976379 kubelet[2537]: E0128 01:15:07.976274 2537 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.143:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.143:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 28 01:15:07.949000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:15:07.978999 kernel: audit: type=1327 audit(1769562907.949:342): proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:15:07.951000 audit[2557]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2557 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.982219 kernel: audit: type=1325 audit(1769562907.951:343): table=filter:45 family=2 entries=2 op=nft_register_chain pid=2557 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.951000 audit[2557]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff709c4e10 a2=0 a3=0 items=0 ppid=2537 pid=2557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.951000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:15:07.965000 audit[2560]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2560 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.965000 audit[2560]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7fffcd7dc190 a2=0 a3=0 items=0 ppid=2537 pid=2560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.965000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 28 01:15:07.966000 audit[2562]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2562 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:07.966000 audit[2562]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff0f918c10 a2=0 a3=0 items=0 ppid=2537 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.966000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 28 01:15:07.967000 audit[2563]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2563 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.967000 audit[2563]: SYSCALL arch=c000003e syscall=46 
success=yes exit=104 a0=3 a1=7ffd09ed2e70 a2=0 a3=0 items=0 ppid=2537 pid=2563 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.967000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 28 01:15:07.968000 audit[2564]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.968000 audit[2564]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff4cceb400 a2=0 a3=0 items=0 ppid=2537 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.968000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 28 01:15:07.969000 audit[2565]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2565 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:07.969000 audit[2565]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe24805170 a2=0 a3=0 items=0 ppid=2537 pid=2565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.969000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 28 01:15:07.970000 audit[2566]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=2566 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:07.970000 audit[2566]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=104 a0=3 a1=7ffdf4f22e40 a2=0 a3=0 items=0 ppid=2537 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.970000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 28 01:15:07.971000 audit[2567]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2567 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:07.971000 audit[2567]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee4b06370 a2=0 a3=0 items=0 ppid=2537 pid=2567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.971000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 28 01:15:07.972000 audit[2568]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2568 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:07.972000 audit[2568]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcc9448f10 a2=0 a3=0 items=0 ppid=2537 pid=2568 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:07.972000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 28 01:15:07.990051 kubelet[2537]: I0128 01:15:07.990033 2537 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 28 01:15:07.990163 kubelet[2537]: I0128 01:15:07.990156 2537 cpu_manager.go:222] 
"Reconciling" reconcilePeriod="10s" Jan 28 01:15:07.990205 kubelet[2537]: I0128 01:15:07.990201 2537 state_mem.go:36] "Initialized new in-memory state store" Jan 28 01:15:07.993052 kubelet[2537]: I0128 01:15:07.993035 2537 policy_none.go:49] "None policy: Start" Jan 28 01:15:07.993141 kubelet[2537]: I0128 01:15:07.993134 2537 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 28 01:15:07.993178 kubelet[2537]: I0128 01:15:07.993174 2537 state_mem.go:35] "Initializing new in-memory state store" Jan 28 01:15:07.999787 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 28 01:15:08.021827 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 28 01:15:08.025151 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 28 01:15:08.043310 kubelet[2537]: E0128 01:15:08.043280 2537 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 28 01:15:08.043507 kubelet[2537]: I0128 01:15:08.043488 2537 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 01:15:08.043531 kubelet[2537]: I0128 01:15:08.043500 2537 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 01:15:08.044116 kubelet[2537]: I0128 01:15:08.044102 2537 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 28 01:15:08.046951 kubelet[2537]: E0128 01:15:08.046812 2537 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 28 01:15:08.046951 kubelet[2537]: E0128 01:15:08.046864 2537 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4593-0-0-n-62761e1650\" not found" Jan 28 01:15:08.080214 systemd[1]: Created slice kubepods-burstable-pod6b411eb2a0f2e7361dac8f8e13a6c7e4.slice - libcontainer container kubepods-burstable-pod6b411eb2a0f2e7361dac8f8e13a6c7e4.slice. Jan 28 01:15:08.087770 kubelet[2537]: E0128 01:15:08.087684 2537 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-62761e1650\" not found" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.091627 systemd[1]: Created slice kubepods-burstable-pod1d9bc333489a6a6279cb4f704c064614.slice - libcontainer container kubepods-burstable-pod1d9bc333489a6a6279cb4f704c064614.slice. Jan 28 01:15:08.102587 kubelet[2537]: E0128 01:15:08.102416 2537 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-62761e1650\" not found" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.106085 systemd[1]: Created slice kubepods-burstable-podbc4d8789fc87387958ceb01a7e9cae39.slice - libcontainer container kubepods-burstable-podbc4d8789fc87387958ceb01a7e9cae39.slice. 
Jan 28 01:15:08.109699 kubelet[2537]: E0128 01:15:08.109658 2537 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-62761e1650\" not found" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.145807 kubelet[2537]: I0128 01:15:08.145767 2537 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.146238 kubelet[2537]: E0128 01:15:08.146183 2537 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.143:6443/api/v1/nodes\": dial tcp 10.0.0.143:6443: connect: connection refused" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.151674 kubelet[2537]: E0128 01:15:08.151640 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.143:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-62761e1650?timeout=10s\": dial tcp 10.0.0.143:6443: connect: connection refused" interval="400ms" Jan 28 01:15:08.253052 kubelet[2537]: I0128 01:15:08.252983 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.253052 kubelet[2537]: I0128 01:15:08.253033 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.253052 kubelet[2537]: I0128 01:15:08.253051 2537 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bc4d8789fc87387958ceb01a7e9cae39-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-n-62761e1650\" (UID: \"bc4d8789fc87387958ceb01a7e9cae39\") " pod="kube-system/kube-scheduler-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.253052 kubelet[2537]: I0128 01:15:08.253068 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6b411eb2a0f2e7361dac8f8e13a6c7e4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-n-62761e1650\" (UID: \"6b411eb2a0f2e7361dac8f8e13a6c7e4\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.253779 kubelet[2537]: I0128 01:15:08.253084 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.253779 kubelet[2537]: I0128 01:15:08.253117 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.253779 kubelet[2537]: I0128 01:15:08.253136 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6b411eb2a0f2e7361dac8f8e13a6c7e4-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-n-62761e1650\" (UID: 
\"6b411eb2a0f2e7361dac8f8e13a6c7e4\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.253779 kubelet[2537]: I0128 01:15:08.253157 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6b411eb2a0f2e7361dac8f8e13a6c7e4-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-n-62761e1650\" (UID: \"6b411eb2a0f2e7361dac8f8e13a6c7e4\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.253779 kubelet[2537]: I0128 01:15:08.253171 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.349587 kubelet[2537]: I0128 01:15:08.349134 2537 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.349587 kubelet[2537]: E0128 01:15:08.349546 2537 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.143:6443/api/v1/nodes\": dial tcp 10.0.0.143:6443: connect: connection refused" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.390113 containerd[1674]: time="2026-01-28T01:15:08.389984885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-n-62761e1650,Uid:6b411eb2a0f2e7361dac8f8e13a6c7e4,Namespace:kube-system,Attempt:0,}" Jan 28 01:15:08.403875 containerd[1674]: time="2026-01-28T01:15:08.403828943Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-n-62761e1650,Uid:1d9bc333489a6a6279cb4f704c064614,Namespace:kube-system,Attempt:0,}" Jan 28 01:15:08.410920 containerd[1674]: time="2026-01-28T01:15:08.410881338Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-n-62761e1650,Uid:bc4d8789fc87387958ceb01a7e9cae39,Namespace:kube-system,Attempt:0,}" Jan 28 01:15:08.428308 containerd[1674]: time="2026-01-28T01:15:08.428254749Z" level=info msg="connecting to shim a4c671a3f6e5d162760c93f92377b4b4b277ce9a29f70b1d2f9a21da6f076bb1" address="unix:///run/containerd/s/0100826bb50b0c179459102f3d36861234afd6403cba869a25722efa94b7c83e" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:08.439175 containerd[1674]: time="2026-01-28T01:15:08.439111559Z" level=info msg="connecting to shim 6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec" address="unix:///run/containerd/s/d5627cad212e84b5a23d4c3a68cb2084981c2fdad63af2ba006e7f2d29682b66" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:08.472227 systemd[1]: Started cri-containerd-a4c671a3f6e5d162760c93f92377b4b4b277ce9a29f70b1d2f9a21da6f076bb1.scope - libcontainer container a4c671a3f6e5d162760c93f92377b4b4b277ce9a29f70b1d2f9a21da6f076bb1. Jan 28 01:15:08.487000 audit: BPF prog-id=83 op=LOAD Jan 28 01:15:08.488000 audit: BPF prog-id=84 op=LOAD Jan 28 01:15:08.488000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2582 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.488000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134633637316133663665356431363237363063393366393233373762 Jan 28 01:15:08.488000 audit: BPF prog-id=84 op=UNLOAD Jan 28 01:15:08.488000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.488000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134633637316133663665356431363237363063393366393233373762 Jan 28 01:15:08.488000 audit: BPF prog-id=85 op=LOAD Jan 28 01:15:08.488000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2582 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.488000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134633637316133663665356431363237363063393366393233373762 Jan 28 01:15:08.489000 audit: BPF prog-id=86 op=LOAD Jan 28 01:15:08.489000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2582 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134633637316133663665356431363237363063393366393233373762 Jan 28 01:15:08.489000 audit: BPF prog-id=86 op=UNLOAD Jan 28 01:15:08.489000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2610 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134633637316133663665356431363237363063393366393233373762 Jan 28 01:15:08.489000 audit: BPF prog-id=85 op=UNLOAD Jan 28 01:15:08.489000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134633637316133663665356431363237363063393366393233373762 Jan 28 01:15:08.490491 containerd[1674]: time="2026-01-28T01:15:08.489761247Z" level=info msg="connecting to shim dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017" address="unix:///run/containerd/s/ae9a98da75e7cec6b1b8d39db82ff38110df4869442ab243b21c3737e64414d5" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:08.489000 audit: BPF prog-id=87 op=LOAD Jan 28 01:15:08.489000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2582 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.489000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134633637316133663665356431363237363063393366393233373762 Jan 28 01:15:08.499208 systemd[1]: Started cri-containerd-6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec.scope - libcontainer container 6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec. Jan 28 01:15:08.518000 audit: BPF prog-id=88 op=LOAD Jan 28 01:15:08.518000 audit: BPF prog-id=89 op=LOAD Jan 28 01:15:08.518000 audit[2625]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=2594 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.518000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666313035393935396631386532626436643132316437363730326133 Jan 28 01:15:08.518000 audit: BPF prog-id=89 op=UNLOAD Jan 28 01:15:08.518000 audit[2625]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2594 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.518000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666313035393935396631386532626436643132316437363730326133 Jan 28 01:15:08.519000 audit: BPF prog-id=90 op=LOAD Jan 28 01:15:08.519000 audit[2625]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=2594 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666313035393935396631386532626436643132316437363730326133 Jan 28 01:15:08.519000 audit: BPF prog-id=91 op=LOAD Jan 28 01:15:08.519000 audit[2625]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000128218 a2=98 a3=0 items=0 ppid=2594 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666313035393935396631386532626436643132316437363730326133 Jan 28 01:15:08.519000 audit: BPF prog-id=91 op=UNLOAD Jan 28 01:15:08.519000 audit[2625]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2594 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666313035393935396631386532626436643132316437363730326133 Jan 28 01:15:08.519000 audit: BPF prog-id=90 op=UNLOAD 
Jan 28 01:15:08.519000 audit[2625]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2594 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666313035393935396631386532626436643132316437363730326133 Jan 28 01:15:08.519000 audit: BPF prog-id=92 op=LOAD Jan 28 01:15:08.519000 audit[2625]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001286e8 a2=98 a3=0 items=0 ppid=2594 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666313035393935396631386532626436643132316437363730326133 Jan 28 01:15:08.525245 systemd[1]: Started cri-containerd-dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017.scope - libcontainer container dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017. 
Jan 28 01:15:08.539174 containerd[1674]: time="2026-01-28T01:15:08.539130895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-n-62761e1650,Uid:6b411eb2a0f2e7361dac8f8e13a6c7e4,Namespace:kube-system,Attempt:0,} returns sandbox id \"a4c671a3f6e5d162760c93f92377b4b4b277ce9a29f70b1d2f9a21da6f076bb1\"" Jan 28 01:15:08.545000 audit: BPF prog-id=93 op=LOAD Jan 28 01:15:08.549180 containerd[1674]: time="2026-01-28T01:15:08.548359508Z" level=info msg="CreateContainer within sandbox \"a4c671a3f6e5d162760c93f92377b4b4b277ce9a29f70b1d2f9a21da6f076bb1\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 28 01:15:08.548000 audit: BPF prog-id=94 op=LOAD Jan 28 01:15:08.548000 audit[2665]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2641 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464333462663238633833656234346164346537383232383164333664 Jan 28 01:15:08.548000 audit: BPF prog-id=94 op=UNLOAD Jan 28 01:15:08.548000 audit[2665]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.548000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464333462663238633833656234346164346537383232383164333664 Jan 28 01:15:08.548000 audit: BPF prog-id=95 op=LOAD Jan 28 01:15:08.548000 audit[2665]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2641 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464333462663238633833656234346164346537383232383164333664 Jan 28 01:15:08.548000 audit: BPF prog-id=96 op=LOAD Jan 28 01:15:08.548000 audit[2665]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2641 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464333462663238633833656234346164346537383232383164333664 Jan 28 01:15:08.548000 audit: BPF prog-id=96 op=UNLOAD Jan 28 01:15:08.548000 audit[2665]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 28 01:15:08.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464333462663238633833656234346164346537383232383164333664 Jan 28 01:15:08.548000 audit: BPF prog-id=95 op=UNLOAD Jan 28 01:15:08.548000 audit[2665]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464333462663238633833656234346164346537383232383164333664 Jan 28 01:15:08.548000 audit: BPF prog-id=97 op=LOAD Jan 28 01:15:08.548000 audit[2665]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2641 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464333462663238633833656234346164346537383232383164333664 Jan 28 01:15:08.552729 kubelet[2537]: E0128 01:15:08.552673 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.143:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-62761e1650?timeout=10s\": dial tcp 10.0.0.143:6443: connect: connection refused" interval="800ms" 
Jan 28 01:15:08.561253 containerd[1674]: time="2026-01-28T01:15:08.561213355Z" level=info msg="Container 4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:08.575837 containerd[1674]: time="2026-01-28T01:15:08.575627117Z" level=info msg="CreateContainer within sandbox \"a4c671a3f6e5d162760c93f92377b4b4b277ce9a29f70b1d2f9a21da6f076bb1\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86\"" Jan 28 01:15:08.577047 containerd[1674]: time="2026-01-28T01:15:08.576212110Z" level=info msg="StartContainer for \"4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86\"" Jan 28 01:15:08.579359 containerd[1674]: time="2026-01-28T01:15:08.579335799Z" level=info msg="connecting to shim 4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86" address="unix:///run/containerd/s/0100826bb50b0c179459102f3d36861234afd6403cba869a25722efa94b7c83e" protocol=ttrpc version=3 Jan 28 01:15:08.595456 containerd[1674]: time="2026-01-28T01:15:08.595415811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-n-62761e1650,Uid:1d9bc333489a6a6279cb4f704c064614,Namespace:kube-system,Attempt:0,} returns sandbox id \"6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec\"" Jan 28 01:15:08.600361 systemd[1]: Started cri-containerd-4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86.scope - libcontainer container 4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86. 
Jan 28 01:15:08.603411 containerd[1674]: time="2026-01-28T01:15:08.603157995Z" level=info msg="CreateContainer within sandbox \"6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 28 01:15:08.612328 containerd[1674]: time="2026-01-28T01:15:08.612289942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-n-62761e1650,Uid:bc4d8789fc87387958ceb01a7e9cae39,Namespace:kube-system,Attempt:0,} returns sandbox id \"dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017\"" Jan 28 01:15:08.615605 containerd[1674]: time="2026-01-28T01:15:08.615491874Z" level=info msg="Container 1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:08.616000 audit: BPF prog-id=98 op=LOAD Jan 28 01:15:08.617652 containerd[1674]: time="2026-01-28T01:15:08.617578070Z" level=info msg="CreateContainer within sandbox \"dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 28 01:15:08.617000 audit: BPF prog-id=99 op=LOAD Jan 28 01:15:08.617000 audit[2698]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2582 pid=2698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461336630343733616365633864313734653963323237303039306239 Jan 28 01:15:08.617000 audit: BPF prog-id=99 op=UNLOAD Jan 28 01:15:08.617000 audit[2698]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 
ppid=2582 pid=2698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461336630343733616365633864313734653963323237303039306239 Jan 28 01:15:08.617000 audit: BPF prog-id=100 op=LOAD Jan 28 01:15:08.617000 audit[2698]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2582 pid=2698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461336630343733616365633864313734653963323237303039306239 Jan 28 01:15:08.617000 audit: BPF prog-id=101 op=LOAD Jan 28 01:15:08.617000 audit[2698]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2582 pid=2698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461336630343733616365633864313734653963323237303039306239 Jan 28 01:15:08.617000 audit: BPF prog-id=101 op=UNLOAD Jan 28 01:15:08.617000 audit[2698]: SYSCALL arch=c000003e syscall=3 
success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461336630343733616365633864313734653963323237303039306239 Jan 28 01:15:08.617000 audit: BPF prog-id=100 op=UNLOAD Jan 28 01:15:08.617000 audit[2698]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461336630343733616365633864313734653963323237303039306239 Jan 28 01:15:08.617000 audit: BPF prog-id=102 op=LOAD Jan 28 01:15:08.617000 audit[2698]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2582 pid=2698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461336630343733616365633864313734653963323237303039306239 Jan 28 01:15:08.628627 containerd[1674]: time="2026-01-28T01:15:08.628580848Z" level=info 
msg="CreateContainer within sandbox \"6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2\"" Jan 28 01:15:08.629482 containerd[1674]: time="2026-01-28T01:15:08.629449286Z" level=info msg="StartContainer for \"1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2\"" Jan 28 01:15:08.630978 containerd[1674]: time="2026-01-28T01:15:08.630947227Z" level=info msg="connecting to shim 1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2" address="unix:///run/containerd/s/d5627cad212e84b5a23d4c3a68cb2084981c2fdad63af2ba006e7f2d29682b66" protocol=ttrpc version=3 Jan 28 01:15:08.633975 containerd[1674]: time="2026-01-28T01:15:08.633951507Z" level=info msg="Container 3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:08.648233 containerd[1674]: time="2026-01-28T01:15:08.648051961Z" level=info msg="CreateContainer within sandbox \"dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01\"" Jan 28 01:15:08.651407 containerd[1674]: time="2026-01-28T01:15:08.651132665Z" level=info msg="StartContainer for \"3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01\"" Jan 28 01:15:08.655128 containerd[1674]: time="2026-01-28T01:15:08.655097604Z" level=info msg="connecting to shim 3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01" address="unix:///run/containerd/s/ae9a98da75e7cec6b1b8d39db82ff38110df4869442ab243b21c3737e64414d5" protocol=ttrpc version=3 Jan 28 01:15:08.658231 systemd[1]: Started cri-containerd-1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2.scope - libcontainer container 
1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2. Jan 28 01:15:08.673871 containerd[1674]: time="2026-01-28T01:15:08.673754977Z" level=info msg="StartContainer for \"4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86\" returns successfully" Jan 28 01:15:08.684342 systemd[1]: Started cri-containerd-3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01.scope - libcontainer container 3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01. Jan 28 01:15:08.690000 audit: BPF prog-id=103 op=LOAD Jan 28 01:15:08.691000 audit: BPF prog-id=104 op=LOAD Jan 28 01:15:08.691000 audit[2731]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2594 pid=2731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.691000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161393332643737633936373935653135346337346433323633353036 Jan 28 01:15:08.691000 audit: BPF prog-id=104 op=UNLOAD Jan 28 01:15:08.691000 audit[2731]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2594 pid=2731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.691000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161393332643737633936373935653135346337346433323633353036 Jan 28 01:15:08.691000 audit: BPF prog-id=105 op=LOAD Jan 28 01:15:08.691000 audit[2731]: 
SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2594 pid=2731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.691000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161393332643737633936373935653135346337346433323633353036 Jan 28 01:15:08.691000 audit: BPF prog-id=106 op=LOAD Jan 28 01:15:08.691000 audit[2731]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2594 pid=2731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.691000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161393332643737633936373935653135346337346433323633353036 Jan 28 01:15:08.691000 audit: BPF prog-id=106 op=UNLOAD Jan 28 01:15:08.691000 audit[2731]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2594 pid=2731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.691000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161393332643737633936373935653135346337346433323633353036 Jan 28 01:15:08.691000 audit: BPF prog-id=105 
op=UNLOAD Jan 28 01:15:08.691000 audit[2731]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2594 pid=2731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.691000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161393332643737633936373935653135346337346433323633353036 Jan 28 01:15:08.691000 audit: BPF prog-id=107 op=LOAD Jan 28 01:15:08.691000 audit[2731]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2594 pid=2731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.691000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161393332643737633936373935653135346337346433323633353036 Jan 28 01:15:08.712000 audit: BPF prog-id=108 op=LOAD Jan 28 01:15:08.714000 audit: BPF prog-id=109 op=LOAD Jan 28 01:15:08.714000 audit[2749]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2641 pid=2749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.714000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364323936333838333735323634383331326434373037616563323761 Jan 28 01:15:08.714000 audit: BPF prog-id=109 op=UNLOAD Jan 28 01:15:08.714000 audit[2749]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=2749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.714000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364323936333838333735323634383331326434373037616563323761 Jan 28 01:15:08.714000 audit: BPF prog-id=110 op=LOAD Jan 28 01:15:08.714000 audit[2749]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2641 pid=2749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.714000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364323936333838333735323634383331326434373037616563323761 Jan 28 01:15:08.715000 audit: BPF prog-id=111 op=LOAD Jan 28 01:15:08.715000 audit[2749]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2641 pid=2749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 28 01:15:08.715000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364323936333838333735323634383331326434373037616563323761 Jan 28 01:15:08.715000 audit: BPF prog-id=111 op=UNLOAD Jan 28 01:15:08.715000 audit[2749]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=2749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.715000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364323936333838333735323634383331326434373037616563323761 Jan 28 01:15:08.715000 audit: BPF prog-id=110 op=UNLOAD Jan 28 01:15:08.715000 audit[2749]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=2749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.715000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364323936333838333735323634383331326434373037616563323761 Jan 28 01:15:08.715000 audit: BPF prog-id=112 op=LOAD Jan 28 01:15:08.715000 audit[2749]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2641 pid=2749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:08.715000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364323936333838333735323634383331326434373037616563323761 Jan 28 01:15:08.753369 kubelet[2537]: I0128 01:15:08.753320 2537 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.754971 containerd[1674]: time="2026-01-28T01:15:08.754541873Z" level=info msg="StartContainer for \"1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2\" returns successfully" Jan 28 01:15:08.786808 containerd[1674]: time="2026-01-28T01:15:08.786700316Z" level=info msg="StartContainer for \"3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01\" returns successfully" Jan 28 01:15:08.990039 kubelet[2537]: E0128 01:15:08.988640 2537 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-62761e1650\" not found" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.995101 kubelet[2537]: E0128 01:15:08.994924 2537 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-62761e1650\" not found" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:08.995877 kubelet[2537]: E0128 01:15:08.995861 2537 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-62761e1650\" not found" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.776103 kubelet[2537]: E0128 01:15:09.775389 2537 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4593-0-0-n-62761e1650\" not found" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.854530 kubelet[2537]: I0128 01:15:09.854473 2537 kubelet_node_status.go:78] "Successfully 
registered node" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.855180 kubelet[2537]: E0128 01:15:09.855055 2537 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4593-0-0-n-62761e1650\": node \"ci-4593-0-0-n-62761e1650\" not found" Jan 28 01:15:09.927671 kubelet[2537]: I0128 01:15:09.927432 2537 apiserver.go:52] "Watching apiserver" Jan 28 01:15:09.950164 kubelet[2537]: I0128 01:15:09.950124 2537 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.953095 kubelet[2537]: I0128 01:15:09.953060 2537 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 28 01:15:09.958191 kubelet[2537]: E0128 01:15:09.958161 2537 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-n-62761e1650\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.958191 kubelet[2537]: I0128 01:15:09.958190 2537 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.960157 kubelet[2537]: E0128 01:15:09.960141 2537 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-n-62761e1650\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.960190 kubelet[2537]: I0128 01:15:09.960160 2537 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.961744 kubelet[2537]: E0128 01:15:09.961725 2537 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" is forbidden: no PriorityClass with name system-node-critical was found" 
pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.996844 kubelet[2537]: I0128 01:15:09.996801 2537 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.997276 kubelet[2537]: I0128 01:15:09.997120 2537 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.999647 kubelet[2537]: E0128 01:15:09.999622 2537 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-n-62761e1650\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:09.999723 kubelet[2537]: E0128 01:15:09.999623 2537 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-n-62761e1650\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.146028 systemd[1]: Reload requested from client PID 2813 ('systemctl') (unit session-8.scope)... Jan 28 01:15:12.146045 systemd[1]: Reloading... Jan 28 01:15:12.226037 zram_generator::config[2859]: No configuration found. Jan 28 01:15:12.438696 systemd[1]: Reloading finished in 292 ms. Jan 28 01:15:12.460840 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:15:12.461538 kubelet[2537]: I0128 01:15:12.460993 2537 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 28 01:15:12.475397 systemd[1]: kubelet.service: Deactivated successfully. Jan 28 01:15:12.475661 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:15:12.474000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:15:12.475723 systemd[1]: kubelet.service: Consumed 1.064s CPU time, 130.1M memory peak. Jan 28 01:15:12.478097 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:15:12.478000 audit: BPF prog-id=113 op=LOAD Jan 28 01:15:12.478000 audit: BPF prog-id=63 op=UNLOAD Jan 28 01:15:12.479000 audit: BPF prog-id=114 op=LOAD Jan 28 01:15:12.479000 audit: BPF prog-id=68 op=UNLOAD Jan 28 01:15:12.480000 audit: BPF prog-id=115 op=LOAD Jan 28 01:15:12.480000 audit: BPF prog-id=78 op=UNLOAD Jan 28 01:15:12.480000 audit: BPF prog-id=116 op=LOAD Jan 28 01:15:12.480000 audit: BPF prog-id=117 op=LOAD Jan 28 01:15:12.480000 audit: BPF prog-id=79 op=UNLOAD Jan 28 01:15:12.480000 audit: BPF prog-id=80 op=UNLOAD Jan 28 01:15:12.481000 audit: BPF prog-id=118 op=LOAD Jan 28 01:15:12.481000 audit: BPF prog-id=72 op=UNLOAD Jan 28 01:15:12.481000 audit: BPF prog-id=119 op=LOAD Jan 28 01:15:12.481000 audit: BPF prog-id=120 op=LOAD Jan 28 01:15:12.481000 audit: BPF prog-id=73 op=UNLOAD Jan 28 01:15:12.481000 audit: BPF prog-id=74 op=UNLOAD Jan 28 01:15:12.482000 audit: BPF prog-id=121 op=LOAD Jan 28 01:15:12.482000 audit: BPF prog-id=75 op=UNLOAD Jan 28 01:15:12.482000 audit: BPF prog-id=122 op=LOAD Jan 28 01:15:12.482000 audit: BPF prog-id=123 op=LOAD Jan 28 01:15:12.482000 audit: BPF prog-id=76 op=UNLOAD Jan 28 01:15:12.482000 audit: BPF prog-id=77 op=UNLOAD Jan 28 01:15:12.486000 audit: BPF prog-id=124 op=LOAD Jan 28 01:15:12.486000 audit: BPF prog-id=125 op=LOAD Jan 28 01:15:12.486000 audit: BPF prog-id=81 op=UNLOAD Jan 28 01:15:12.486000 audit: BPF prog-id=82 op=UNLOAD Jan 28 01:15:12.487000 audit: BPF prog-id=126 op=LOAD Jan 28 01:15:12.487000 audit: BPF prog-id=67 op=UNLOAD Jan 28 01:15:12.487000 audit: BPF prog-id=127 op=LOAD Jan 28 01:15:12.487000 audit: BPF prog-id=69 op=UNLOAD Jan 28 01:15:12.487000 audit: BPF prog-id=128 op=LOAD Jan 28 01:15:12.487000 audit: BPF prog-id=129 op=LOAD Jan 28 01:15:12.487000 audit: BPF prog-id=70 
op=UNLOAD Jan 28 01:15:12.487000 audit: BPF prog-id=71 op=UNLOAD Jan 28 01:15:12.488000 audit: BPF prog-id=130 op=LOAD Jan 28 01:15:12.488000 audit: BPF prog-id=64 op=UNLOAD Jan 28 01:15:12.488000 audit: BPF prog-id=131 op=LOAD Jan 28 01:15:12.488000 audit: BPF prog-id=132 op=LOAD Jan 28 01:15:12.488000 audit: BPF prog-id=65 op=UNLOAD Jan 28 01:15:12.488000 audit: BPF prog-id=66 op=UNLOAD Jan 28 01:15:12.626286 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:15:12.625000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:12.640338 (kubelet)[2910]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 28 01:15:12.679944 kubelet[2910]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 01:15:12.679944 kubelet[2910]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 28 01:15:12.679944 kubelet[2910]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 28 01:15:12.680335 kubelet[2910]: I0128 01:15:12.680017 2910 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 01:15:12.686077 kubelet[2910]: I0128 01:15:12.685980 2910 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 28 01:15:12.686239 kubelet[2910]: I0128 01:15:12.686151 2910 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 28 01:15:12.691081 kubelet[2910]: I0128 01:15:12.690254 2910 server.go:956] "Client rotation is on, will bootstrap in background" Jan 28 01:15:12.692534 kubelet[2910]: I0128 01:15:12.692506 2910 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 28 01:15:12.694501 kubelet[2910]: I0128 01:15:12.694478 2910 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 28 01:15:12.697957 kubelet[2910]: I0128 01:15:12.697945 2910 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 01:15:12.700854 kubelet[2910]: I0128 01:15:12.700838 2910 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 28 01:15:12.701133 kubelet[2910]: I0128 01:15:12.701107 2910 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 01:15:12.701336 kubelet[2910]: I0128 01:15:12.701186 2910 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-n-62761e1650","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 01:15:12.701451 kubelet[2910]: I0128 01:15:12.701443 2910 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 
01:15:12.701486 kubelet[2910]: I0128 01:15:12.701482 2910 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 01:15:12.701558 kubelet[2910]: I0128 01:15:12.701553 2910 state_mem.go:36] "Initialized new in-memory state store" Jan 28 01:15:12.701758 kubelet[2910]: I0128 01:15:12.701750 2910 kubelet.go:480] "Attempting to sync node with API server" Jan 28 01:15:12.701814 kubelet[2910]: I0128 01:15:12.701808 2910 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 01:15:12.701871 kubelet[2910]: I0128 01:15:12.701866 2910 kubelet.go:386] "Adding apiserver pod source" Jan 28 01:15:12.701914 kubelet[2910]: I0128 01:15:12.701910 2910 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 28 01:15:12.703276 kubelet[2910]: I0128 01:15:12.703260 2910 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 28 01:15:12.703704 kubelet[2910]: I0128 01:15:12.703682 2910 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 28 01:15:12.706939 kubelet[2910]: I0128 01:15:12.706738 2910 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 28 01:15:12.706939 kubelet[2910]: I0128 01:15:12.706774 2910 server.go:1289] "Started kubelet" Jan 28 01:15:12.717221 kubelet[2910]: I0128 01:15:12.717192 2910 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 28 01:15:12.720220 kubelet[2910]: I0128 01:15:12.719307 2910 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 28 01:15:12.720717 kubelet[2910]: I0128 01:15:12.720702 2910 server.go:317] "Adding debug handlers to kubelet server" Jan 28 01:15:12.725022 kubelet[2910]: I0128 01:15:12.723500 2910 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 28 01:15:12.725022 kubelet[2910]: I0128 01:15:12.723677 2910 
server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 28 01:15:12.725022 kubelet[2910]: I0128 01:15:12.723884 2910 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 28 01:15:12.727497 kubelet[2910]: I0128 01:15:12.727481 2910 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 28 01:15:12.727677 kubelet[2910]: E0128 01:15:12.727664 2910 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-n-62761e1650\" not found" Jan 28 01:15:12.736563 kubelet[2910]: I0128 01:15:12.734934 2910 factory.go:223] Registration of the systemd container factory successfully Jan 28 01:15:12.736563 kubelet[2910]: I0128 01:15:12.736128 2910 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 28 01:15:12.740020 kubelet[2910]: I0128 01:15:12.739945 2910 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 28 01:15:12.740483 kubelet[2910]: I0128 01:15:12.740468 2910 reconciler.go:26] "Reconciler: start to sync state" Jan 28 01:15:12.743975 kubelet[2910]: I0128 01:15:12.743642 2910 factory.go:223] Registration of the containerd container factory successfully Jan 28 01:15:12.745168 kubelet[2910]: I0128 01:15:12.744960 2910 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 28 01:15:12.745897 kubelet[2910]: I0128 01:15:12.745876 2910 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jan 28 01:15:12.745897 kubelet[2910]: I0128 01:15:12.745894 2910 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 28 01:15:12.745972 kubelet[2910]: I0128 01:15:12.745914 2910 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 28 01:15:12.745972 kubelet[2910]: I0128 01:15:12.745921 2910 kubelet.go:2436] "Starting kubelet main sync loop" Jan 28 01:15:12.746027 kubelet[2910]: E0128 01:15:12.745982 2910 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 01:15:12.816824 kubelet[2910]: I0128 01:15:12.816798 2910 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 28 01:15:12.816824 kubelet[2910]: I0128 01:15:12.816816 2910 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 28 01:15:12.816992 kubelet[2910]: I0128 01:15:12.816845 2910 state_mem.go:36] "Initialized new in-memory state store" Jan 28 01:15:12.817328 kubelet[2910]: I0128 01:15:12.816998 2910 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 28 01:15:12.817328 kubelet[2910]: I0128 01:15:12.817245 2910 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 28 01:15:12.817328 kubelet[2910]: I0128 01:15:12.817282 2910 policy_none.go:49] "None policy: Start" Jan 28 01:15:12.817328 kubelet[2910]: I0128 01:15:12.817294 2910 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 28 01:15:12.817328 kubelet[2910]: I0128 01:15:12.817305 2910 state_mem.go:35] "Initializing new in-memory state store" Jan 28 01:15:12.817447 kubelet[2910]: I0128 01:15:12.817400 2910 state_mem.go:75] "Updated machine memory state" Jan 28 01:15:12.823043 kubelet[2910]: E0128 01:15:12.822916 2910 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 28 01:15:12.823627 kubelet[2910]: I0128 
01:15:12.823600 2910 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 01:15:12.823667 kubelet[2910]: I0128 01:15:12.823615 2910 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 01:15:12.824771 kubelet[2910]: I0128 01:15:12.823816 2910 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 28 01:15:12.826695 kubelet[2910]: E0128 01:15:12.825738 2910 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 28 01:15:12.847601 kubelet[2910]: I0128 01:15:12.847571 2910 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.847922 kubelet[2910]: I0128 01:15:12.847907 2910 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.849022 kubelet[2910]: I0128 01:15:12.848265 2910 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.927759 kubelet[2910]: I0128 01:15:12.927720 2910 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942299 kubelet[2910]: I0128 01:15:12.941001 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942299 kubelet[2910]: I0128 01:15:12.941054 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/bc4d8789fc87387958ceb01a7e9cae39-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-n-62761e1650\" (UID: \"bc4d8789fc87387958ceb01a7e9cae39\") " pod="kube-system/kube-scheduler-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942299 kubelet[2910]: I0128 01:15:12.941071 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942299 kubelet[2910]: I0128 01:15:12.941088 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942299 kubelet[2910]: I0128 01:15:12.941104 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6b411eb2a0f2e7361dac8f8e13a6c7e4-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-n-62761e1650\" (UID: \"6b411eb2a0f2e7361dac8f8e13a6c7e4\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942525 kubelet[2910]: I0128 01:15:12.941117 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6b411eb2a0f2e7361dac8f8e13a6c7e4-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-n-62761e1650\" (UID: \"6b411eb2a0f2e7361dac8f8e13a6c7e4\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942525 kubelet[2910]: I0128 01:15:12.941133 2910 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6b411eb2a0f2e7361dac8f8e13a6c7e4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-n-62761e1650\" (UID: \"6b411eb2a0f2e7361dac8f8e13a6c7e4\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942525 kubelet[2910]: I0128 01:15:12.941148 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942525 kubelet[2910]: I0128 01:15:12.941162 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1d9bc333489a6a6279cb4f704c064614-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-n-62761e1650\" (UID: \"1d9bc333489a6a6279cb4f704c064614\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942525 kubelet[2910]: I0128 01:15:12.942422 2910 kubelet_node_status.go:124] "Node was previously registered" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:12.942525 kubelet[2910]: I0128 01:15:12.942500 2910 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-n-62761e1650" Jan 28 01:15:13.702810 kubelet[2910]: I0128 01:15:13.702653 2910 apiserver.go:52] "Watching apiserver" Jan 28 01:15:13.740909 kubelet[2910]: I0128 01:15:13.740570 2910 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 28 01:15:13.757280 kubelet[2910]: I0128 01:15:13.757210 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-scheduler-ci-4593-0-0-n-62761e1650" podStartSLOduration=1.757164607 podStartE2EDuration="1.757164607s" podCreationTimestamp="2026-01-28 01:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:15:13.757111579 +0000 UTC m=+1.112963044" watchObservedRunningTime="2026-01-28 01:15:13.757164607 +0000 UTC m=+1.113016051" Jan 28 01:15:13.783039 kubelet[2910]: I0128 01:15:13.782818 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4593-0-0-n-62761e1650" podStartSLOduration=1.7828018110000001 podStartE2EDuration="1.782801811s" podCreationTimestamp="2026-01-28 01:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:15:13.771280114 +0000 UTC m=+1.127131580" watchObservedRunningTime="2026-01-28 01:15:13.782801811 +0000 UTC m=+1.138653254" Jan 28 01:15:13.796502 kubelet[2910]: I0128 01:15:13.796459 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" podStartSLOduration=1.7964401319999999 podStartE2EDuration="1.796440132s" podCreationTimestamp="2026-01-28 01:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:15:13.783445005 +0000 UTC m=+1.139296470" watchObservedRunningTime="2026-01-28 01:15:13.796440132 +0000 UTC m=+1.152291657" Jan 28 01:15:17.481997 kubelet[2910]: I0128 01:15:17.481928 2910 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 28 01:15:17.482450 kubelet[2910]: I0128 01:15:17.482407 2910 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 28 01:15:17.482489 containerd[1674]: 
time="2026-01-28T01:15:17.482243708Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 28 01:15:18.238769 systemd[1]: Created slice kubepods-besteffort-podf1eaa242_3bbf_4134_b13b_ec9cb75217b0.slice - libcontainer container kubepods-besteffort-podf1eaa242_3bbf_4134_b13b_ec9cb75217b0.slice. Jan 28 01:15:18.272874 kubelet[2910]: I0128 01:15:18.272754 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1eaa242-3bbf-4134-b13b-ec9cb75217b0-lib-modules\") pod \"kube-proxy-rtmvx\" (UID: \"f1eaa242-3bbf-4134-b13b-ec9cb75217b0\") " pod="kube-system/kube-proxy-rtmvx" Jan 28 01:15:18.272874 kubelet[2910]: I0128 01:15:18.272825 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f1eaa242-3bbf-4134-b13b-ec9cb75217b0-kube-proxy\") pod \"kube-proxy-rtmvx\" (UID: \"f1eaa242-3bbf-4134-b13b-ec9cb75217b0\") " pod="kube-system/kube-proxy-rtmvx" Jan 28 01:15:18.272874 kubelet[2910]: I0128 01:15:18.272844 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f1eaa242-3bbf-4134-b13b-ec9cb75217b0-xtables-lock\") pod \"kube-proxy-rtmvx\" (UID: \"f1eaa242-3bbf-4134-b13b-ec9cb75217b0\") " pod="kube-system/kube-proxy-rtmvx" Jan 28 01:15:18.273223 kubelet[2910]: I0128 01:15:18.273103 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f682j\" (UniqueName: \"kubernetes.io/projected/f1eaa242-3bbf-4134-b13b-ec9cb75217b0-kube-api-access-f682j\") pod \"kube-proxy-rtmvx\" (UID: \"f1eaa242-3bbf-4134-b13b-ec9cb75217b0\") " pod="kube-system/kube-proxy-rtmvx" Jan 28 01:15:18.548924 containerd[1674]: time="2026-01-28T01:15:18.548761714Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-proxy-rtmvx,Uid:f1eaa242-3bbf-4134-b13b-ec9cb75217b0,Namespace:kube-system,Attempt:0,}" Jan 28 01:15:18.575614 containerd[1674]: time="2026-01-28T01:15:18.575567726Z" level=info msg="connecting to shim 74d5596222ee60da4e6264ad5d4922b63fd8316997f9f69439be61f0f4b715b0" address="unix:///run/containerd/s/a55307b57f476818a3f4b877bed468a1fa64efdd7ed541b9aaaa6e6ffdf0c86d" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:18.611255 systemd[1]: Started cri-containerd-74d5596222ee60da4e6264ad5d4922b63fd8316997f9f69439be61f0f4b715b0.scope - libcontainer container 74d5596222ee60da4e6264ad5d4922b63fd8316997f9f69439be61f0f4b715b0. Jan 28 01:15:18.623000 audit: BPF prog-id=133 op=LOAD Jan 28 01:15:18.625455 kernel: kauditd_printk_skb: 200 callbacks suppressed Jan 28 01:15:18.625503 kernel: audit: type=1334 audit(1769562918.623:442): prog-id=133 op=LOAD Jan 28 01:15:18.626000 audit: BPF prog-id=134 op=LOAD Jan 28 01:15:18.630030 kernel: audit: type=1334 audit(1769562918.626:443): prog-id=134 op=LOAD Jan 28 01:15:18.630155 kernel: audit: type=1300 audit(1769562918.626:443): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.626000 audit[2978]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.626000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.636638 kernel: audit: type=1327 audit(1769562918.626:443): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.627000 audit: BPF prog-id=134 op=UNLOAD Jan 28 01:15:18.627000 audit[2978]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.643179 kernel: audit: type=1334 audit(1769562918.627:444): prog-id=134 op=UNLOAD Jan 28 01:15:18.643214 kernel: audit: type=1300 audit(1769562918.627:444): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.628000 audit: BPF prog-id=135 op=LOAD Jan 28 01:15:18.651267 kernel: audit: type=1327 audit(1769562918.627:444): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.651302 kernel: audit: type=1334 audit(1769562918.628:445): prog-id=135 op=LOAD Jan 28 01:15:18.628000 audit[2978]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.660929 kernel: audit: type=1300 audit(1769562918.628:445): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.661053 kernel: audit: type=1327 audit(1769562918.628:445): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.628000 audit: BPF prog-id=136 op=LOAD Jan 28 01:15:18.628000 audit[2978]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.628000 audit: BPF prog-id=136 op=UNLOAD Jan 28 01:15:18.628000 audit[2978]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.628000 audit: BPF prog-id=135 op=UNLOAD Jan 28 01:15:18.628000 audit[2978]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.628000 audit: BPF prog-id=137 op=LOAD Jan 28 01:15:18.628000 audit[2978]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2967 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734643535393632323265653630646134653632363461643564343932 Jan 28 01:15:18.678585 containerd[1674]: time="2026-01-28T01:15:18.678540059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rtmvx,Uid:f1eaa242-3bbf-4134-b13b-ec9cb75217b0,Namespace:kube-system,Attempt:0,} returns sandbox id \"74d5596222ee60da4e6264ad5d4922b63fd8316997f9f69439be61f0f4b715b0\"" Jan 28 01:15:18.688633 containerd[1674]: time="2026-01-28T01:15:18.688596635Z" level=info msg="CreateContainer within sandbox \"74d5596222ee60da4e6264ad5d4922b63fd8316997f9f69439be61f0f4b715b0\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 28 01:15:18.706642 containerd[1674]: time="2026-01-28T01:15:18.706602337Z" level=info msg="Container 5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:18.725774 containerd[1674]: time="2026-01-28T01:15:18.725709173Z" level=info msg="CreateContainer within sandbox \"74d5596222ee60da4e6264ad5d4922b63fd8316997f9f69439be61f0f4b715b0\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c\"" Jan 28 01:15:18.735261 containerd[1674]: time="2026-01-28T01:15:18.735224694Z" level=info msg="StartContainer for \"5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c\"" Jan 28 01:15:18.738704 containerd[1674]: time="2026-01-28T01:15:18.738660473Z" level=info msg="connecting to shim 5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c" 
address="unix:///run/containerd/s/a55307b57f476818a3f4b877bed468a1fa64efdd7ed541b9aaaa6e6ffdf0c86d" protocol=ttrpc version=3 Jan 28 01:15:18.779521 systemd[1]: Started cri-containerd-5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c.scope - libcontainer container 5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c. Jan 28 01:15:18.811079 systemd[1]: Created slice kubepods-besteffort-podfedefd09_4499_4313_b871_41abc589dc82.slice - libcontainer container kubepods-besteffort-podfedefd09_4499_4313_b871_41abc589dc82.slice. Jan 28 01:15:18.876619 kubelet[2910]: I0128 01:15:18.876525 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fedefd09-4499-4313-b871-41abc589dc82-var-lib-calico\") pod \"tigera-operator-7dcd859c48-h6v6g\" (UID: \"fedefd09-4499-4313-b871-41abc589dc82\") " pod="tigera-operator/tigera-operator-7dcd859c48-h6v6g" Jan 28 01:15:18.876619 kubelet[2910]: I0128 01:15:18.876585 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nksnq\" (UniqueName: \"kubernetes.io/projected/fedefd09-4499-4313-b871-41abc589dc82-kube-api-access-nksnq\") pod \"tigera-operator-7dcd859c48-h6v6g\" (UID: \"fedefd09-4499-4313-b871-41abc589dc82\") " pod="tigera-operator/tigera-operator-7dcd859c48-h6v6g" Jan 28 01:15:18.880000 audit: BPF prog-id=138 op=LOAD Jan 28 01:15:18.880000 audit[3006]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2967 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.880000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663265643137633162313363653137333238356561313931343461 Jan 28 01:15:18.880000 audit: BPF prog-id=139 op=LOAD Jan 28 01:15:18.880000 audit[3006]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2967 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663265643137633162313363653137333238356561313931343461 Jan 28 01:15:18.880000 audit: BPF prog-id=139 op=UNLOAD Jan 28 01:15:18.880000 audit[3006]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2967 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663265643137633162313363653137333238356561313931343461 Jan 28 01:15:18.880000 audit: BPF prog-id=138 op=UNLOAD Jan 28 01:15:18.880000 audit[3006]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2967 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:15:18.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663265643137633162313363653137333238356561313931343461 Jan 28 01:15:18.880000 audit: BPF prog-id=140 op=LOAD Jan 28 01:15:18.880000 audit[3006]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2967 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:18.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663265643137633162313363653137333238356561313931343461 Jan 28 01:15:18.903716 containerd[1674]: time="2026-01-28T01:15:18.903674337Z" level=info msg="StartContainer for \"5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c\" returns successfully" Jan 28 01:15:19.024000 audit[3070]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3070 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.024000 audit[3070]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff5ecbb6d0 a2=0 a3=7fff5ecbb6bc items=0 ppid=3019 pid=3070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.024000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 28 01:15:19.025000 audit[3071]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3071 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.025000 audit[3071]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff56680e70 a2=0 a3=7fff56680e5c items=0 ppid=3019 pid=3071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.025000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 28 01:15:19.026000 audit[3074]: NETFILTER_CFG table=nat:56 family=10 entries=1 op=nft_register_chain pid=3074 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.027000 audit[3073]: NETFILTER_CFG table=nat:57 family=2 entries=1 op=nft_register_chain pid=3073 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.027000 audit[3073]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe4abccfa0 a2=0 a3=7ffe4abccf8c items=0 ppid=3019 pid=3073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.027000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 28 01:15:19.026000 audit[3074]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcd73fe4f0 a2=0 a3=7ffcd73fe4dc items=0 ppid=3019 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.026000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 28 01:15:19.028000 audit[3075]: NETFILTER_CFG table=filter:58 family=2 entries=1 
op=nft_register_chain pid=3075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.029000 audit[3076]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3076 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.028000 audit[3075]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffdbaccdd0 a2=0 a3=7fffdbaccdbc items=0 ppid=3019 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.028000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 28 01:15:19.029000 audit[3076]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd86850d10 a2=0 a3=7ffd86850cfc items=0 ppid=3019 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.029000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 28 01:15:19.117181 containerd[1674]: time="2026-01-28T01:15:19.117084340Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-h6v6g,Uid:fedefd09-4499-4313-b871-41abc589dc82,Namespace:tigera-operator,Attempt:0,}" Jan 28 01:15:19.135000 audit[3079]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3079 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.135000 audit[3079]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe9ea69a30 a2=0 a3=7ffe9ea69a1c items=0 ppid=3019 pid=3079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.135000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 28 01:15:19.140000 audit[3081]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3081 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.140000 audit[3081]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fffa0681b40 a2=0 a3=7fffa0681b2c items=0 ppid=3019 pid=3081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.140000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 28 01:15:19.146827 containerd[1674]: time="2026-01-28T01:15:19.146791009Z" level=info msg="connecting to shim 3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce" address="unix:///run/containerd/s/76156d2b767e5dc73bde3a7de5908e721044acbed11f7f52a9455991f54d77d1" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:19.145000 audit[3092]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3092 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.145000 audit[3092]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc48c7ed50 a2=0 a3=7ffc48c7ed3c items=0 ppid=3019 pid=3092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.145000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 28 01:15:19.147000 audit[3094]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3094 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.147000 audit[3094]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc497e54e0 a2=0 a3=7ffc497e54cc items=0 ppid=3019 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.147000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 28 01:15:19.149000 audit[3100]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3100 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.149000 audit[3100]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe78b5bdf0 a2=0 a3=7ffe78b5bddc items=0 ppid=3019 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.149000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 28 01:15:19.151000 audit[3104]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3104 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.151000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=100 a0=3 a1=7ffcfc674e20 a2=0 a3=7ffcfc674e0c items=0 ppid=3019 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.151000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 28 01:15:19.153000 audit[3109]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3109 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.153000 audit[3109]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffdd302a8b0 a2=0 a3=7ffdd302a89c items=0 ppid=3019 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.153000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 28 01:15:19.157000 audit[3118]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3118 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.157000 audit[3118]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd8a9d8290 a2=0 a3=7ffd8a9d827c items=0 ppid=3019 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.157000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 28 01:15:19.159000 audit[3119]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.159000 audit[3119]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff92625820 a2=0 a3=7fff9262580c items=0 ppid=3019 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.159000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 28 01:15:19.162000 audit[3125]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.162000 audit[3125]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffc7492370 a2=0 a3=7fffc749235c items=0 ppid=3019 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.162000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 28 01:15:19.164000 audit[3127]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.164000 audit[3127]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 
a1=7ffe2c1da5b0 a2=0 a3=7ffe2c1da59c items=0 ppid=3019 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.164000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 28 01:15:19.169000 audit[3129]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.169000 audit[3129]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe8571a0f0 a2=0 a3=7ffe8571a0dc items=0 ppid=3019 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.169000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 01:15:19.173202 systemd[1]: Started cri-containerd-3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce.scope - libcontainer container 3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce. 
Jan 28 01:15:19.176000 audit[3134]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.176000 audit[3134]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffecd5cdf0 a2=0 a3=7fffecd5cddc items=0 ppid=3019 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.176000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 01:15:19.180000 audit[3142]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.180000 audit[3142]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd4fca0fb0 a2=0 a3=7ffd4fca0f9c items=0 ppid=3019 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.180000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 28 01:15:19.181000 audit[3143]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.181000 audit[3143]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe2f6d0dd0 a2=0 a3=7ffe2f6d0dbc items=0 ppid=3019 pid=3143 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.181000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 28 01:15:19.184000 audit: BPF prog-id=141 op=LOAD Jan 28 01:15:19.184000 audit: BPF prog-id=142 op=LOAD Jan 28 01:15:19.184000 audit[3108]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3091 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.184000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361643461616231356164316265633038326333373364326530316364 Jan 28 01:15:19.184000 audit: BPF prog-id=142 op=UNLOAD Jan 28 01:15:19.184000 audit[3108]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.184000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361643461616231356164316265633038326333373364326530316364 Jan 28 01:15:19.185000 audit: BPF prog-id=143 op=LOAD Jan 28 01:15:19.185000 audit[3108]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3091 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.185000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361643461616231356164316265633038326333373364326530316364 Jan 28 01:15:19.185000 audit: BPF prog-id=144 op=LOAD Jan 28 01:15:19.185000 audit[3108]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3091 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.185000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361643461616231356164316265633038326333373364326530316364 Jan 28 01:15:19.185000 audit: BPF prog-id=144 op=UNLOAD Jan 28 01:15:19.185000 audit[3108]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.185000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361643461616231356164316265633038326333373364326530316364 Jan 28 01:15:19.185000 audit: BPF prog-id=143 op=UNLOAD Jan 28 01:15:19.185000 audit[3108]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=3108 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.185000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361643461616231356164316265633038326333373364326530316364 Jan 28 01:15:19.185000 audit: BPF prog-id=145 op=LOAD Jan 28 01:15:19.185000 audit[3108]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3091 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.185000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361643461616231356164316265633038326333373364326530316364 Jan 28 01:15:19.186000 audit[3145]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.186000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc5c9165d0 a2=0 a3=7ffc5c9165bc items=0 ppid=3019 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.186000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 01:15:19.190000 audit[3148]: NETFILTER_CFG table=nat:76 family=2 entries=1 
op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.190000 audit[3148]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff384567f0 a2=0 a3=7fff384567dc items=0 ppid=3019 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.190000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 01:15:19.191000 audit[3149]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.191000 audit[3149]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe8138d6f0 a2=0 a3=7ffe8138d6dc items=0 ppid=3019 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.191000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 28 01:15:19.194000 audit[3151]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:15:19.194000 audit[3151]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffdab976580 a2=0 a3=7ffdab97656c items=0 ppid=3019 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.194000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 28 01:15:19.222000 audit[3157]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:19.222000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd1573b590 a2=0 a3=7ffd1573b57c items=0 ppid=3019 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.222000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:19.224928 containerd[1674]: time="2026-01-28T01:15:19.224885382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-h6v6g,Uid:fedefd09-4499-4313-b871-41abc589dc82,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce\"" Jan 28 01:15:19.226625 containerd[1674]: time="2026-01-28T01:15:19.226607591Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 28 01:15:19.231000 audit[3157]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:19.231000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffd1573b590 a2=0 a3=7ffd1573b57c items=0 ppid=3019 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.231000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:19.233000 audit[3170]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3170 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.233000 audit[3170]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fffe693e2c0 a2=0 a3=7fffe693e2ac items=0 ppid=3019 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.233000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 28 01:15:19.236000 audit[3172]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3172 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.236000 audit[3172]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7fff2d1e5fb0 a2=0 a3=7fff2d1e5f9c items=0 ppid=3019 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.236000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 28 01:15:19.240000 audit[3175]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.240000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffd84a0c910 a2=0 a3=7ffd84a0c8fc items=0 ppid=3019 pid=3175 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.240000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 28 01:15:19.241000 audit[3176]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3176 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.241000 audit[3176]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffeb31175b0 a2=0 a3=7ffeb311759c items=0 ppid=3019 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.241000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 28 01:15:19.244000 audit[3178]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3178 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.244000 audit[3178]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe71b5a7d0 a2=0 a3=7ffe71b5a7bc items=0 ppid=3019 pid=3178 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.244000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 28 01:15:19.245000 
audit[3179]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3179 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.245000 audit[3179]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffff8fe5de0 a2=0 a3=7ffff8fe5dcc items=0 ppid=3019 pid=3179 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.245000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 28 01:15:19.250000 audit[3181]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3181 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.250000 audit[3181]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff91ba7940 a2=0 a3=7fff91ba792c items=0 ppid=3019 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.250000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 28 01:15:19.253000 audit[3184]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3184 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.253000 audit[3184]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffffe475a10 a2=0 a3=7ffffe4759fc items=0 ppid=3019 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 28 01:15:19.253000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 28 01:15:19.255000 audit[3185]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3185 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.255000 audit[3185]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe1261cd80 a2=0 a3=7ffe1261cd6c items=0 ppid=3019 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.255000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 28 01:15:19.258000 audit[3187]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3187 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.258000 audit[3187]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd999700c0 a2=0 a3=7ffd999700ac items=0 ppid=3019 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.258000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 28 01:15:19.259000 audit[3188]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.259000 audit[3188]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcae679c50 a2=0 a3=7ffcae679c3c items=0 ppid=3019 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.259000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 28 01:15:19.262000 audit[3190]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.262000 audit[3190]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc76297b80 a2=0 a3=7ffc76297b6c items=0 ppid=3019 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.262000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 01:15:19.266000 audit[3193]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.266000 audit[3193]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffac28c280 a2=0 a3=7fffac28c26c items=0 ppid=3019 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.266000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 28 01:15:19.269000 audit[3196]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.269000 audit[3196]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe348f2680 a2=0 a3=7ffe348f266c items=0 ppid=3019 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.269000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 28 01:15:19.271000 audit[3197]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.271000 audit[3197]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe8f26c440 a2=0 a3=7ffe8f26c42c items=0 ppid=3019 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.271000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 28 01:15:19.273000 audit[3199]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.273000 audit[3199]: SYSCALL arch=c000003e syscall=46 
success=yes exit=524 a0=3 a1=7ffe9fb4c870 a2=0 a3=7ffe9fb4c85c items=0 ppid=3019 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.273000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 01:15:19.276000 audit[3202]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3202 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.276000 audit[3202]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc1dad8ec0 a2=0 a3=7ffc1dad8eac items=0 ppid=3019 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.276000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 01:15:19.277000 audit[3203]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.277000 audit[3203]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcf4c2ece0 a2=0 a3=7ffcf4c2eccc items=0 ppid=3019 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.277000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 28 01:15:19.280000 audit[3205]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.280000 audit[3205]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffe3ae02680 a2=0 a3=7ffe3ae0266c items=0 ppid=3019 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.280000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 28 01:15:19.281000 audit[3206]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3206 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.281000 audit[3206]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff004d7410 a2=0 a3=7fff004d73fc items=0 ppid=3019 pid=3206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.281000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 28 01:15:19.284000 audit[3208]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.284000 audit[3208]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffc47f75160 a2=0 a3=7ffc47f7514c items=0 ppid=3019 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.284000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:15:19.287000 audit[3211]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3211 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:15:19.287000 audit[3211]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffc0a9f62e0 a2=0 a3=7ffc0a9f62cc items=0 ppid=3019 pid=3211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.287000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:15:19.291000 audit[3213]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 28 01:15:19.291000 audit[3213]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffee6030430 a2=0 a3=7ffee603041c items=0 ppid=3019 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.291000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:19.292000 audit[3213]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 28 01:15:19.292000 audit[3213]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffee6030430 a2=0 a3=7ffee603041c items=0 
ppid=3019 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:19.292000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:19.826252 kubelet[2910]: I0128 01:15:19.825340 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-rtmvx" podStartSLOduration=1.825320702 podStartE2EDuration="1.825320702s" podCreationTimestamp="2026-01-28 01:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:15:19.825225415 +0000 UTC m=+7.181076881" watchObservedRunningTime="2026-01-28 01:15:19.825320702 +0000 UTC m=+7.181172185" Jan 28 01:15:20.902123 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2885857274.mount: Deactivated successfully. 
Jan 28 01:15:21.349469 containerd[1674]: time="2026-01-28T01:15:21.349408227Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:21.352081 containerd[1674]: time="2026-01-28T01:15:21.352041485Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 28 01:15:21.354093 containerd[1674]: time="2026-01-28T01:15:21.353918093Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:21.356579 containerd[1674]: time="2026-01-28T01:15:21.356545353Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:21.357153 containerd[1674]: time="2026-01-28T01:15:21.357133508Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.130422552s" Jan 28 01:15:21.357210 containerd[1674]: time="2026-01-28T01:15:21.357159532Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 28 01:15:21.364022 containerd[1674]: time="2026-01-28T01:15:21.363401124Z" level=info msg="CreateContainer within sandbox \"3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 28 01:15:21.376763 containerd[1674]: time="2026-01-28T01:15:21.376725750Z" level=info msg="Container 
2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:21.386561 containerd[1674]: time="2026-01-28T01:15:21.386509608Z" level=info msg="CreateContainer within sandbox \"3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623\"" Jan 28 01:15:21.387248 containerd[1674]: time="2026-01-28T01:15:21.387225605Z" level=info msg="StartContainer for \"2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623\"" Jan 28 01:15:21.388370 containerd[1674]: time="2026-01-28T01:15:21.388346639Z" level=info msg="connecting to shim 2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623" address="unix:///run/containerd/s/76156d2b767e5dc73bde3a7de5908e721044acbed11f7f52a9455991f54d77d1" protocol=ttrpc version=3 Jan 28 01:15:21.410239 systemd[1]: Started cri-containerd-2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623.scope - libcontainer container 2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623. 
Jan 28 01:15:21.420000 audit: BPF prog-id=146 op=LOAD Jan 28 01:15:21.421000 audit: BPF prog-id=147 op=LOAD Jan 28 01:15:21.421000 audit[3222]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3091 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:21.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233373461323635366433616436636466343132363833393533613766 Jan 28 01:15:21.421000 audit: BPF prog-id=147 op=UNLOAD Jan 28 01:15:21.421000 audit[3222]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:21.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233373461323635366433616436636466343132363833393533613766 Jan 28 01:15:21.421000 audit: BPF prog-id=148 op=LOAD Jan 28 01:15:21.421000 audit[3222]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3091 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:21.421000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233373461323635366433616436636466343132363833393533613766 Jan 28 01:15:21.421000 audit: BPF prog-id=149 op=LOAD Jan 28 01:15:21.421000 audit[3222]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3091 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:21.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233373461323635366433616436636466343132363833393533613766 Jan 28 01:15:21.421000 audit: BPF prog-id=149 op=UNLOAD Jan 28 01:15:21.421000 audit[3222]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:21.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233373461323635366433616436636466343132363833393533613766 Jan 28 01:15:21.421000 audit: BPF prog-id=148 op=UNLOAD Jan 28 01:15:21.421000 audit[3222]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:15:21.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233373461323635366433616436636466343132363833393533613766 Jan 28 01:15:21.421000 audit: BPF prog-id=150 op=LOAD Jan 28 01:15:21.421000 audit[3222]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3091 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:21.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233373461323635366433616436636466343132363833393533613766 Jan 28 01:15:21.441846 containerd[1674]: time="2026-01-28T01:15:21.441808767Z" level=info msg="StartContainer for \"2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623\" returns successfully" Jan 28 01:15:21.834430 kubelet[2910]: I0128 01:15:21.833918 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-h6v6g" podStartSLOduration=1.701724829 podStartE2EDuration="3.833895526s" podCreationTimestamp="2026-01-28 01:15:18 +0000 UTC" firstStartedPulling="2026-01-28 01:15:19.226071471 +0000 UTC m=+6.581922916" lastFinishedPulling="2026-01-28 01:15:21.35824217 +0000 UTC m=+8.714093613" observedRunningTime="2026-01-28 01:15:21.833583721 +0000 UTC m=+9.189435222" watchObservedRunningTime="2026-01-28 01:15:21.833895526 +0000 UTC m=+9.189747042" Jan 28 01:15:27.111234 sudo[1946]: pam_unix(sudo:session): session closed for user root Jan 28 01:15:27.116870 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 28 01:15:27.116995 
kernel: audit: type=1106 audit(1769562927.110:522): pid=1946 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:15:27.117040 kernel: audit: type=1104 audit(1769562927.110:523): pid=1946 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:15:27.110000 audit[1946]: USER_END pid=1946 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:15:27.110000 audit[1946]: CRED_DISP pid=1946 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 28 01:15:27.206113 sshd[1945]: Connection closed by 20.161.92.111 port 36952 Jan 28 01:15:27.207194 sshd-session[1941]: pam_unix(sshd:session): session closed for user core Jan 28 01:15:27.208000 audit[1941]: USER_END pid=1941 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:15:27.215022 kernel: audit: type=1106 audit(1769562927.208:524): pid=1941 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:15:27.217131 systemd[1]: sshd@6-10.0.0.143:22-20.161.92.111:36952.service: Deactivated successfully. Jan 28 01:15:27.213000 audit[1941]: CRED_DISP pid=1941 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:15:27.223580 systemd[1]: session-8.scope: Deactivated successfully. Jan 28 01:15:27.223874 systemd[1]: session-8.scope: Consumed 4.748s CPU time, 232.1M memory peak. 
Jan 28 01:15:27.225046 kernel: audit: type=1104 audit(1769562927.213:525): pid=1941 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:15:27.216000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.0.143:22-20.161.92.111:36952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:27.229785 systemd-logind[1658]: Session 8 logged out. Waiting for processes to exit. Jan 28 01:15:27.230025 kernel: audit: type=1131 audit(1769562927.216:526): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.0.143:22-20.161.92.111:36952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:15:27.231535 systemd-logind[1658]: Removed session 8. 
Jan 28 01:15:28.116000 audit[3304]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3304 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:28.121102 kernel: audit: type=1325 audit(1769562928.116:527): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3304 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:28.121175 kernel: audit: type=1300 audit(1769562928.116:527): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffce3e0ebd0 a2=0 a3=7ffce3e0ebbc items=0 ppid=3019 pid=3304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:28.116000 audit[3304]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffce3e0ebd0 a2=0 a3=7ffce3e0ebbc items=0 ppid=3019 pid=3304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:28.116000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:28.132031 kernel: audit: type=1327 audit(1769562928.116:527): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:28.120000 audit[3304]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3304 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:28.136026 kernel: audit: type=1325 audit(1769562928.120:528): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3304 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:28.120000 audit[3304]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffce3e0ebd0 a2=0 a3=0 items=0 ppid=3019 pid=3304 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:28.142048 kernel: audit: type=1300 audit(1769562928.120:528): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffce3e0ebd0 a2=0 a3=0 items=0 ppid=3019 pid=3304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:28.120000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:28.147000 audit[3306]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3306 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:28.147000 audit[3306]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffe35919900 a2=0 a3=7ffe359198ec items=0 ppid=3019 pid=3306 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:28.147000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:28.152000 audit[3306]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3306 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:28.152000 audit[3306]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe35919900 a2=0 a3=0 items=0 ppid=3019 pid=3306 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:28.152000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:31.309000 audit[3308]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3308 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:31.309000 audit[3308]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffd4a681d00 a2=0 a3=7ffd4a681cec items=0 ppid=3019 pid=3308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:31.309000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:31.314000 audit[3308]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3308 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:31.314000 audit[3308]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd4a681d00 a2=0 a3=0 items=0 ppid=3019 pid=3308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:31.314000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:31.324000 audit[3310]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3310 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:31.324000 audit[3310]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffe49931c30 a2=0 a3=7ffe49931c1c items=0 ppid=3019 pid=3310 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:31.324000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:31.328000 audit[3310]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3310 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:31.328000 audit[3310]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe49931c30 a2=0 a3=0 items=0 ppid=3019 pid=3310 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:31.328000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:32.373000 audit[3312]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3312 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:32.375483 kernel: kauditd_printk_skb: 19 callbacks suppressed Jan 28 01:15:32.375608 kernel: audit: type=1325 audit(1769562932.373:535): table=filter:113 family=2 entries=19 op=nft_register_rule pid=3312 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:32.373000 audit[3312]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff7e9de3b0 a2=0 a3=7fff7e9de39c items=0 ppid=3019 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:32.380095 kernel: audit: type=1300 audit(1769562932.373:535): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff7e9de3b0 a2=0 a3=7fff7e9de39c items=0 ppid=3019 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:32.373000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:32.384059 kernel: audit: type=1327 audit(1769562932.373:535): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:32.385000 audit[3312]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3312 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:32.393954 kernel: audit: type=1325 audit(1769562932.385:536): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3312 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:32.394078 kernel: audit: type=1300 audit(1769562932.385:536): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff7e9de3b0 a2=0 a3=0 items=0 ppid=3019 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:32.385000 audit[3312]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff7e9de3b0 a2=0 a3=0 items=0 ppid=3019 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:32.385000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:32.397173 kernel: audit: type=1327 audit(1769562932.385:536): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:33.207602 systemd[1]: Created slice 
kubepods-besteffort-poda4c61b24_9247_490a_ad3d_34d9dd85a479.slice - libcontainer container kubepods-besteffort-poda4c61b24_9247_490a_ad3d_34d9dd85a479.slice. Jan 28 01:15:33.278468 kubelet[2910]: I0128 01:15:33.278364 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljr9c\" (UniqueName: \"kubernetes.io/projected/a4c61b24-9247-490a-ad3d-34d9dd85a479-kube-api-access-ljr9c\") pod \"calico-typha-5c85f49c8d-4v98m\" (UID: \"a4c61b24-9247-490a-ad3d-34d9dd85a479\") " pod="calico-system/calico-typha-5c85f49c8d-4v98m" Jan 28 01:15:33.278468 kubelet[2910]: I0128 01:15:33.278463 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a4c61b24-9247-490a-ad3d-34d9dd85a479-typha-certs\") pod \"calico-typha-5c85f49c8d-4v98m\" (UID: \"a4c61b24-9247-490a-ad3d-34d9dd85a479\") " pod="calico-system/calico-typha-5c85f49c8d-4v98m" Jan 28 01:15:33.278960 kubelet[2910]: I0128 01:15:33.278516 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c61b24-9247-490a-ad3d-34d9dd85a479-tigera-ca-bundle\") pod \"calico-typha-5c85f49c8d-4v98m\" (UID: \"a4c61b24-9247-490a-ad3d-34d9dd85a479\") " pod="calico-system/calico-typha-5c85f49c8d-4v98m" Jan 28 01:15:33.361692 systemd[1]: Created slice kubepods-besteffort-podd4240574_dfab_47cb_8cbc_9a03c181cf95.slice - libcontainer container kubepods-besteffort-podd4240574_dfab_47cb_8cbc_9a03c181cf95.slice. 
Jan 28 01:15:33.379623 kubelet[2910]: I0128 01:15:33.379556 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-var-lib-calico\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.379918 kubelet[2910]: I0128 01:15:33.379808 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-flexvol-driver-host\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.379918 kubelet[2910]: I0128 01:15:33.379829 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-cni-net-dir\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.380917 kubelet[2910]: I0128 01:15:33.380042 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-xtables-lock\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.380917 kubelet[2910]: I0128 01:15:33.380075 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-cni-bin-dir\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.380917 kubelet[2910]: I0128 01:15:33.380090 2910 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-cni-log-dir\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.380917 kubelet[2910]: I0128 01:15:33.380106 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-lib-modules\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.380917 kubelet[2910]: I0128 01:15:33.380135 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4240574-dfab-47cb-8cbc-9a03c181cf95-tigera-ca-bundle\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.381519 kubelet[2910]: I0128 01:15:33.380155 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d4240574-dfab-47cb-8cbc-9a03c181cf95-node-certs\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.381519 kubelet[2910]: I0128 01:15:33.380170 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-policysync\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.381519 kubelet[2910]: I0128 01:15:33.380186 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d4240574-dfab-47cb-8cbc-9a03c181cf95-var-run-calico\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.381519 kubelet[2910]: I0128 01:15:33.380202 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8bdm\" (UniqueName: \"kubernetes.io/projected/d4240574-dfab-47cb-8cbc-9a03c181cf95-kube-api-access-g8bdm\") pod \"calico-node-m9j58\" (UID: \"d4240574-dfab-47cb-8cbc-9a03c181cf95\") " pod="calico-system/calico-node-m9j58" Jan 28 01:15:33.408000 audit[3316]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:33.408000 audit[3316]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fff51834c70 a2=0 a3=7fff51834c5c items=0 ppid=3019 pid=3316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.414796 kernel: audit: type=1325 audit(1769562933.408:537): table=filter:115 family=2 entries=21 op=nft_register_rule pid=3316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:33.414853 kernel: audit: type=1300 audit(1769562933.408:537): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fff51834c70 a2=0 a3=7fff51834c5c items=0 ppid=3019 pid=3316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.408000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:33.418788 kernel: audit: type=1327 audit(1769562933.408:537): 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:33.411000 audit[3316]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:33.421444 kernel: audit: type=1325 audit(1769562933.411:538): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:33.411000 audit[3316]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff51834c70 a2=0 a3=0 items=0 ppid=3019 pid=3316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.411000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:33.490368 kubelet[2910]: E0128 01:15:33.490228 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.490368 kubelet[2910]: W0128 01:15:33.490262 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.490591 kubelet[2910]: E0128 01:15:33.490551 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.498317 kubelet[2910]: E0128 01:15:33.498286 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.498317 kubelet[2910]: W0128 01:15:33.498310 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.498481 kubelet[2910]: E0128 01:15:33.498332 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.513265 containerd[1674]: time="2026-01-28T01:15:33.513164710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5c85f49c8d-4v98m,Uid:a4c61b24-9247-490a-ad3d-34d9dd85a479,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:33.558771 containerd[1674]: time="2026-01-28T01:15:33.558713973Z" level=info msg="connecting to shim 146504ff689e90043ef8055ee6f9700149e2c205715c626fe0f93773a4fca2a9" address="unix:///run/containerd/s/28a5f0b4073d6b0684777b64f9c1c9dee5d6f01a2b9bbb3f8d709d47acaedfd1" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:33.567128 kubelet[2910]: E0128 01:15:33.566791 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:15:33.568134 kubelet[2910]: E0128 01:15:33.567855 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.568134 kubelet[2910]: W0128 01:15:33.567867 2910 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.568134 kubelet[2910]: E0128 01:15:33.567883 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.568214 kubelet[2910]: E0128 01:15:33.568182 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.568214 kubelet[2910]: W0128 01:15:33.568189 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.568214 kubelet[2910]: E0128 01:15:33.568199 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.568952 kubelet[2910]: E0128 01:15:33.568890 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.568952 kubelet[2910]: W0128 01:15:33.568903 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.568952 kubelet[2910]: E0128 01:15:33.568913 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.582431 kubelet[2910]: I0128 01:15:33.582201 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e60412d5-27c3-4569-9b64-5743c10cc437-registration-dir\") pod \"csi-node-driver-lkr4f\" (UID: \"e60412d5-27c3-4569-9b64-5743c10cc437\") " pod="calico-system/csi-node-driver-lkr4f" Jan 28 01:15:33.582431 kubelet[2910]: E0128 01:15:33.582364 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.582431 kubelet[2910]: W0128 01:15:33.582371 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.582431 kubelet[2910]: E0128 01:15:33.582379 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.582431 kubelet[2910]: I0128 01:15:33.582397 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e60412d5-27c3-4569-9b64-5743c10cc437-varrun\") pod \"csi-node-driver-lkr4f\" (UID: \"e60412d5-27c3-4569-9b64-5743c10cc437\") " pod="calico-system/csi-node-driver-lkr4f" Jan 28 01:15:33.582610 kubelet[2910]: E0128 01:15:33.582524 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.582610 kubelet[2910]: W0128 01:15:33.582533 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.582610 kubelet[2910]: E0128 01:15:33.582540 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.582610 kubelet[2910]: I0128 01:15:33.582552 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e60412d5-27c3-4569-9b64-5743c10cc437-kubelet-dir\") pod \"csi-node-driver-lkr4f\" (UID: \"e60412d5-27c3-4569-9b64-5743c10cc437\") " pod="calico-system/csi-node-driver-lkr4f" Jan 28 01:15:33.582874 kubelet[2910]: E0128 01:15:33.582716 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.582874 kubelet[2910]: W0128 01:15:33.582723 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.582874 kubelet[2910]: E0128 01:15:33.582730 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.582874 kubelet[2910]: I0128 01:15:33.582742 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e60412d5-27c3-4569-9b64-5743c10cc437-socket-dir\") pod \"csi-node-driver-lkr4f\" (UID: \"e60412d5-27c3-4569-9b64-5743c10cc437\") " pod="calico-system/csi-node-driver-lkr4f" Jan 28 01:15:33.583713 kubelet[2910]: E0128 01:15:33.583051 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.583713 kubelet[2910]: W0128 01:15:33.583060 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.583713 kubelet[2910]: E0128 01:15:33.583068 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.583713 kubelet[2910]: I0128 01:15:33.583082 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgfdf\" (UniqueName: \"kubernetes.io/projected/e60412d5-27c3-4569-9b64-5743c10cc437-kube-api-access-zgfdf\") pod \"csi-node-driver-lkr4f\" (UID: \"e60412d5-27c3-4569-9b64-5743c10cc437\") " pod="calico-system/csi-node-driver-lkr4f" Jan 28 01:15:33.584924 kubelet[2910]: E0128 01:15:33.583951 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.584924 kubelet[2910]: W0128 01:15:33.583963 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.584924 kubelet[2910]: E0128 01:15:33.583972 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.584924 kubelet[2910]: E0128 01:15:33.584357 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.584924 kubelet[2910]: W0128 01:15:33.584365 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.584924 kubelet[2910]: E0128 01:15:33.584372 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.584924 kubelet[2910]: E0128 01:15:33.584502 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.584924 kubelet[2910]: W0128 01:15:33.584604 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.584924 kubelet[2910]: E0128 01:15:33.584613 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.584924 kubelet[2910]: E0128 01:15:33.584812 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.585237 kubelet[2910]: W0128 01:15:33.584817 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.585237 kubelet[2910]: E0128 01:15:33.584824 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.585237 kubelet[2910]: E0128 01:15:33.585040 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.585237 kubelet[2910]: W0128 01:15:33.585045 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.585237 kubelet[2910]: E0128 01:15:33.585051 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.585521 kubelet[2910]: E0128 01:15:33.585399 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.585521 kubelet[2910]: W0128 01:15:33.585413 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.585521 kubelet[2910]: E0128 01:15:33.585419 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.586022 kubelet[2910]: E0128 01:15:33.585895 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.586022 kubelet[2910]: W0128 01:15:33.585905 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.586022 kubelet[2910]: E0128 01:15:33.585912 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.586448 kubelet[2910]: E0128 01:15:33.586438 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.586448 kubelet[2910]: W0128 01:15:33.586446 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.586732 kubelet[2910]: E0128 01:15:33.586452 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.586732 kubelet[2910]: E0128 01:15:33.586567 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.586732 kubelet[2910]: W0128 01:15:33.586573 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.586732 kubelet[2910]: E0128 01:15:33.586578 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.586732 kubelet[2910]: E0128 01:15:33.586690 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.586732 kubelet[2910]: W0128 01:15:33.586694 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.586732 kubelet[2910]: E0128 01:15:33.586699 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.602244 systemd[1]: Started cri-containerd-146504ff689e90043ef8055ee6f9700149e2c205715c626fe0f93773a4fca2a9.scope - libcontainer container 146504ff689e90043ef8055ee6f9700149e2c205715c626fe0f93773a4fca2a9. 
Jan 28 01:15:33.616000 audit: BPF prog-id=151 op=LOAD Jan 28 01:15:33.617000 audit: BPF prog-id=152 op=LOAD Jan 28 01:15:33.617000 audit[3367]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3330 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134363530346666363839653930303433656638303535656536663937 Jan 28 01:15:33.617000 audit: BPF prog-id=152 op=UNLOAD Jan 28 01:15:33.617000 audit[3367]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3330 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134363530346666363839653930303433656638303535656536663937 Jan 28 01:15:33.617000 audit: BPF prog-id=153 op=LOAD Jan 28 01:15:33.617000 audit[3367]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3330 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.617000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134363530346666363839653930303433656638303535656536663937 Jan 28 01:15:33.617000 audit: BPF prog-id=154 op=LOAD Jan 28 01:15:33.617000 audit[3367]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3330 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134363530346666363839653930303433656638303535656536663937 Jan 28 01:15:33.617000 audit: BPF prog-id=154 op=UNLOAD Jan 28 01:15:33.617000 audit[3367]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3330 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134363530346666363839653930303433656638303535656536663937 Jan 28 01:15:33.618000 audit: BPF prog-id=153 op=UNLOAD Jan 28 01:15:33.618000 audit[3367]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3330 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:15:33.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134363530346666363839653930303433656638303535656536663937 Jan 28 01:15:33.618000 audit: BPF prog-id=155 op=LOAD Jan 28 01:15:33.618000 audit[3367]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3330 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134363530346666363839653930303433656638303535656536663937 Jan 28 01:15:33.669334 containerd[1674]: time="2026-01-28T01:15:33.669297278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m9j58,Uid:d4240574-dfab-47cb-8cbc-9a03c181cf95,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:33.678687 containerd[1674]: time="2026-01-28T01:15:33.678635835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5c85f49c8d-4v98m,Uid:a4c61b24-9247-490a-ad3d-34d9dd85a479,Namespace:calico-system,Attempt:0,} returns sandbox id \"146504ff689e90043ef8055ee6f9700149e2c205715c626fe0f93773a4fca2a9\"" Jan 28 01:15:33.679892 containerd[1674]: time="2026-01-28T01:15:33.679861824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 28 01:15:33.684443 kubelet[2910]: E0128 01:15:33.684416 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.684443 kubelet[2910]: W0128 01:15:33.684435 2910 driver-call.go:149] FlexVolume: driver call failed: 
executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.684575 kubelet[2910]: E0128 01:15:33.684453 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.684648 kubelet[2910]: E0128 01:15:33.684639 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.684648 kubelet[2910]: W0128 01:15:33.684647 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.684705 kubelet[2910]: E0128 01:15:33.684655 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.684850 kubelet[2910]: E0128 01:15:33.684784 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.684850 kubelet[2910]: W0128 01:15:33.684792 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.684850 kubelet[2910]: E0128 01:15:33.684798 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.685041 kubelet[2910]: E0128 01:15:33.685031 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.685224 kubelet[2910]: W0128 01:15:33.685040 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.685224 kubelet[2910]: E0128 01:15:33.685053 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.685224 kubelet[2910]: E0128 01:15:33.685183 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.685224 kubelet[2910]: W0128 01:15:33.685188 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.685224 kubelet[2910]: E0128 01:15:33.685195 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.685486 kubelet[2910]: E0128 01:15:33.685321 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.685486 kubelet[2910]: W0128 01:15:33.685326 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.685486 kubelet[2910]: E0128 01:15:33.685332 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.685486 kubelet[2910]: E0128 01:15:33.685482 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.685486 kubelet[2910]: W0128 01:15:33.685488 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.685756 kubelet[2910]: E0128 01:15:33.685494 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.685756 kubelet[2910]: E0128 01:15:33.685616 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.685756 kubelet[2910]: W0128 01:15:33.685621 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.685756 kubelet[2910]: E0128 01:15:33.685627 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.685756 kubelet[2910]: E0128 01:15:33.685724 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.685756 kubelet[2910]: W0128 01:15:33.685728 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.685756 kubelet[2910]: E0128 01:15:33.685734 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.686262 kubelet[2910]: E0128 01:15:33.685888 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.686262 kubelet[2910]: W0128 01:15:33.685893 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.686262 kubelet[2910]: E0128 01:15:33.685899 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.686262 kubelet[2910]: E0128 01:15:33.686177 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.686262 kubelet[2910]: W0128 01:15:33.686184 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.686262 kubelet[2910]: E0128 01:15:33.686191 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.686484 kubelet[2910]: E0128 01:15:33.686469 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.686484 kubelet[2910]: W0128 01:15:33.686480 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.686820 kubelet[2910]: E0128 01:15:33.686487 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.686820 kubelet[2910]: E0128 01:15:33.686697 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.686820 kubelet[2910]: W0128 01:15:33.686709 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.686820 kubelet[2910]: E0128 01:15:33.686722 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.687077 kubelet[2910]: E0128 01:15:33.686941 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.687077 kubelet[2910]: W0128 01:15:33.686949 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.687077 kubelet[2910]: E0128 01:15:33.686956 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.687812 kubelet[2910]: E0128 01:15:33.687226 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.687812 kubelet[2910]: W0128 01:15:33.687233 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.687812 kubelet[2910]: E0128 01:15:33.687242 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.687812 kubelet[2910]: E0128 01:15:33.687405 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.687812 kubelet[2910]: W0128 01:15:33.687412 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.687812 kubelet[2910]: E0128 01:15:33.687419 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.687812 kubelet[2910]: E0128 01:15:33.687604 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.687812 kubelet[2910]: W0128 01:15:33.687610 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.687812 kubelet[2910]: E0128 01:15:33.687615 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.688087 kubelet[2910]: E0128 01:15:33.688078 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.688087 kubelet[2910]: W0128 01:15:33.688086 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.688155 kubelet[2910]: E0128 01:15:33.688092 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.688244 kubelet[2910]: E0128 01:15:33.688236 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.688290 kubelet[2910]: W0128 01:15:33.688244 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.688290 kubelet[2910]: E0128 01:15:33.688250 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.688442 kubelet[2910]: E0128 01:15:33.688432 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.688442 kubelet[2910]: W0128 01:15:33.688441 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.688505 kubelet[2910]: E0128 01:15:33.688448 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.688605 kubelet[2910]: E0128 01:15:33.688596 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.688605 kubelet[2910]: W0128 01:15:33.688603 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.688662 kubelet[2910]: E0128 01:15:33.688609 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.688777 kubelet[2910]: E0128 01:15:33.688768 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.688777 kubelet[2910]: W0128 01:15:33.688777 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.688777 kubelet[2910]: E0128 01:15:33.688783 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.689060 kubelet[2910]: E0128 01:15:33.689050 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.689060 kubelet[2910]: W0128 01:15:33.689059 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.689114 kubelet[2910]: E0128 01:15:33.689066 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.689626 kubelet[2910]: E0128 01:15:33.689611 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.689626 kubelet[2910]: W0128 01:15:33.689624 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.689684 kubelet[2910]: E0128 01:15:33.689632 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.689863 kubelet[2910]: E0128 01:15:33.689846 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.689863 kubelet[2910]: W0128 01:15:33.689856 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.689863 kubelet[2910]: E0128 01:15:33.689862 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:33.698926 kubelet[2910]: E0128 01:15:33.698867 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:33.698926 kubelet[2910]: W0128 01:15:33.698882 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:33.698926 kubelet[2910]: E0128 01:15:33.698896 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:33.701458 containerd[1674]: time="2026-01-28T01:15:33.701390129Z" level=info msg="connecting to shim f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3" address="unix:///run/containerd/s/477df8dcf35e8fc7bec38ae7b920240307c8f275fc87a791b9c4fb993462fded" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:33.731225 systemd[1]: Started cri-containerd-f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3.scope - libcontainer container f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3. 
Jan 28 01:15:33.740000 audit: BPF prog-id=156 op=LOAD Jan 28 01:15:33.740000 audit: BPF prog-id=157 op=LOAD Jan 28 01:15:33.740000 audit[3461]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3449 pid=3461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.740000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631386461386531636665316565336335663534316263373163316661 Jan 28 01:15:33.740000 audit: BPF prog-id=157 op=UNLOAD Jan 28 01:15:33.740000 audit[3461]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.740000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631386461386531636665316565336335663534316263373163316661 Jan 28 01:15:33.741000 audit: BPF prog-id=158 op=LOAD Jan 28 01:15:33.741000 audit[3461]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3449 pid=3461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.741000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631386461386531636665316565336335663534316263373163316661 Jan 28 01:15:33.741000 audit: BPF prog-id=159 op=LOAD Jan 28 01:15:33.741000 audit[3461]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3449 pid=3461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.741000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631386461386531636665316565336335663534316263373163316661 Jan 28 01:15:33.741000 audit: BPF prog-id=159 op=UNLOAD Jan 28 01:15:33.741000 audit[3461]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.741000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631386461386531636665316565336335663534316263373163316661 Jan 28 01:15:33.741000 audit: BPF prog-id=158 op=UNLOAD Jan 28 01:15:33.741000 audit[3461]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:15:33.741000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631386461386531636665316565336335663534316263373163316661 Jan 28 01:15:33.741000 audit: BPF prog-id=160 op=LOAD Jan 28 01:15:33.741000 audit[3461]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3449 pid=3461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:33.741000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631386461386531636665316565336335663534316263373163316661 Jan 28 01:15:33.762897 containerd[1674]: time="2026-01-28T01:15:33.762856107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m9j58,Uid:d4240574-dfab-47cb-8cbc-9a03c181cf95,Namespace:calico-system,Attempt:0,} returns sandbox id \"f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3\"" Jan 28 01:15:34.748236 kubelet[2910]: E0128 01:15:34.748186 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:15:35.221086 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1397742456.mount: Deactivated successfully. 
Jan 28 01:15:36.246508 containerd[1674]: time="2026-01-28T01:15:36.246454130Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:36.248348 containerd[1674]: time="2026-01-28T01:15:36.248323277Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 28 01:15:36.249610 containerd[1674]: time="2026-01-28T01:15:36.249577650Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:36.253551 containerd[1674]: time="2026-01-28T01:15:36.252831792Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:36.253551 containerd[1674]: time="2026-01-28T01:15:36.253330454Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.573445028s" Jan 28 01:15:36.253551 containerd[1674]: time="2026-01-28T01:15:36.253356089Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 28 01:15:36.255473 containerd[1674]: time="2026-01-28T01:15:36.255370082Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 28 01:15:36.269472 containerd[1674]: time="2026-01-28T01:15:36.269444019Z" level=info msg="CreateContainer within sandbox \"146504ff689e90043ef8055ee6f9700149e2c205715c626fe0f93773a4fca2a9\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 28 01:15:36.283833 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2404544336.mount: Deactivated successfully. Jan 28 01:15:36.285357 containerd[1674]: time="2026-01-28T01:15:36.285178498Z" level=info msg="Container 1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:36.298278 containerd[1674]: time="2026-01-28T01:15:36.298209755Z" level=info msg="CreateContainer within sandbox \"146504ff689e90043ef8055ee6f9700149e2c205715c626fe0f93773a4fca2a9\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce\"" Jan 28 01:15:36.298971 containerd[1674]: time="2026-01-28T01:15:36.298945245Z" level=info msg="StartContainer for \"1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce\"" Jan 28 01:15:36.300944 containerd[1674]: time="2026-01-28T01:15:36.300406551Z" level=info msg="connecting to shim 1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce" address="unix:///run/containerd/s/28a5f0b4073d6b0684777b64f9c1c9dee5d6f01a2b9bbb3f8d709d47acaedfd1" protocol=ttrpc version=3 Jan 28 01:15:36.323199 systemd[1]: Started cri-containerd-1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce.scope - libcontainer container 1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce. 
Jan 28 01:15:36.333000 audit: BPF prog-id=161 op=LOAD Jan 28 01:15:36.333000 audit: BPF prog-id=162 op=LOAD Jan 28 01:15:36.333000 audit[3496]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3330 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:36.333000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166616630616363346365626533646662623364363733326563366435 Jan 28 01:15:36.333000 audit: BPF prog-id=162 op=UNLOAD Jan 28 01:15:36.333000 audit[3496]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3330 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:36.333000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166616630616363346365626533646662623364363733326563366435 Jan 28 01:15:36.333000 audit: BPF prog-id=163 op=LOAD Jan 28 01:15:36.333000 audit[3496]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3330 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:36.333000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166616630616363346365626533646662623364363733326563366435 Jan 28 01:15:36.333000 audit: BPF prog-id=164 op=LOAD Jan 28 01:15:36.333000 audit[3496]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3330 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:36.333000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166616630616363346365626533646662623364363733326563366435 Jan 28 01:15:36.333000 audit: BPF prog-id=164 op=UNLOAD Jan 28 01:15:36.333000 audit[3496]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3330 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:36.333000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166616630616363346365626533646662623364363733326563366435 Jan 28 01:15:36.333000 audit: BPF prog-id=163 op=UNLOAD Jan 28 01:15:36.333000 audit[3496]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3330 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:15:36.333000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166616630616363346365626533646662623364363733326563366435 Jan 28 01:15:36.333000 audit: BPF prog-id=165 op=LOAD Jan 28 01:15:36.333000 audit[3496]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3330 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:36.333000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166616630616363346365626533646662623364363733326563366435 Jan 28 01:15:36.376563 containerd[1674]: time="2026-01-28T01:15:36.376507430Z" level=info msg="StartContainer for \"1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce\" returns successfully" Jan 28 01:15:36.747095 kubelet[2910]: E0128 01:15:36.747024 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:15:36.869475 kubelet[2910]: I0128 01:15:36.869426 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5c85f49c8d-4v98m" podStartSLOduration=1.294533382 podStartE2EDuration="3.869411327s" podCreationTimestamp="2026-01-28 01:15:33 +0000 UTC" firstStartedPulling="2026-01-28 01:15:33.679660898 +0000 UTC m=+21.035512343" lastFinishedPulling="2026-01-28 
01:15:36.254538843 +0000 UTC m=+23.610390288" observedRunningTime="2026-01-28 01:15:36.868644894 +0000 UTC m=+24.224496362" watchObservedRunningTime="2026-01-28 01:15:36.869411327 +0000 UTC m=+24.225262793" Jan 28 01:15:36.895609 kubelet[2910]: E0128 01:15:36.895555 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.895609 kubelet[2910]: W0128 01:15:36.895582 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.895609 kubelet[2910]: E0128 01:15:36.895606 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.895804 kubelet[2910]: E0128 01:15:36.895777 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.895804 kubelet[2910]: W0128 01:15:36.895783 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.895804 kubelet[2910]: E0128 01:15:36.895800 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.895951 kubelet[2910]: E0128 01:15:36.895935 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.895951 kubelet[2910]: W0128 01:15:36.895951 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.895996 kubelet[2910]: E0128 01:15:36.895957 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.896162 kubelet[2910]: E0128 01:15:36.896152 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.896189 kubelet[2910]: W0128 01:15:36.896167 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.896189 kubelet[2910]: E0128 01:15:36.896174 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.896326 kubelet[2910]: E0128 01:15:36.896318 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.896326 kubelet[2910]: W0128 01:15:36.896326 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.896367 kubelet[2910]: E0128 01:15:36.896332 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.896467 kubelet[2910]: E0128 01:15:36.896459 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.896467 kubelet[2910]: W0128 01:15:36.896467 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.896506 kubelet[2910]: E0128 01:15:36.896472 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.896627 kubelet[2910]: E0128 01:15:36.896619 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.896627 kubelet[2910]: W0128 01:15:36.896627 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.896665 kubelet[2910]: E0128 01:15:36.896633 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.896799 kubelet[2910]: E0128 01:15:36.896789 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.896799 kubelet[2910]: W0128 01:15:36.896798 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.896843 kubelet[2910]: E0128 01:15:36.896804 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.896956 kubelet[2910]: E0128 01:15:36.896947 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.896956 kubelet[2910]: W0128 01:15:36.896955 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.897002 kubelet[2910]: E0128 01:15:36.896961 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.897112 kubelet[2910]: E0128 01:15:36.897103 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.897112 kubelet[2910]: W0128 01:15:36.897111 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.897158 kubelet[2910]: E0128 01:15:36.897117 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.897255 kubelet[2910]: E0128 01:15:36.897246 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.897255 kubelet[2910]: W0128 01:15:36.897254 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.897295 kubelet[2910]: E0128 01:15:36.897259 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.897390 kubelet[2910]: E0128 01:15:36.897382 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.897390 kubelet[2910]: W0128 01:15:36.897389 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.897430 kubelet[2910]: E0128 01:15:36.897396 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.897521 kubelet[2910]: E0128 01:15:36.897512 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.897541 kubelet[2910]: W0128 01:15:36.897520 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.897541 kubelet[2910]: E0128 01:15:36.897538 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.897665 kubelet[2910]: E0128 01:15:36.897656 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.897711 kubelet[2910]: W0128 01:15:36.897694 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.897733 kubelet[2910]: E0128 01:15:36.897704 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.897848 kubelet[2910]: E0128 01:15:36.897839 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.897848 kubelet[2910]: W0128 01:15:36.897847 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.897894 kubelet[2910]: E0128 01:15:36.897853 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.908387 kubelet[2910]: E0128 01:15:36.908354 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.908387 kubelet[2910]: W0128 01:15:36.908379 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.908387 kubelet[2910]: E0128 01:15:36.908397 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.908578 kubelet[2910]: E0128 01:15:36.908565 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.908578 kubelet[2910]: W0128 01:15:36.908574 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.908619 kubelet[2910]: E0128 01:15:36.908582 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.908735 kubelet[2910]: E0128 01:15:36.908727 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.908735 kubelet[2910]: W0128 01:15:36.908734 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.908787 kubelet[2910]: E0128 01:15:36.908741 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.908904 kubelet[2910]: E0128 01:15:36.908892 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.908904 kubelet[2910]: W0128 01:15:36.908903 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.908951 kubelet[2910]: E0128 01:15:36.908910 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.909093 kubelet[2910]: E0128 01:15:36.909084 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.909093 kubelet[2910]: W0128 01:15:36.909092 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.909141 kubelet[2910]: E0128 01:15:36.909113 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.909243 kubelet[2910]: E0128 01:15:36.909235 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.909243 kubelet[2910]: W0128 01:15:36.909243 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.909287 kubelet[2910]: E0128 01:15:36.909258 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.909442 kubelet[2910]: E0128 01:15:36.909433 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.909442 kubelet[2910]: W0128 01:15:36.909441 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.909487 kubelet[2910]: E0128 01:15:36.909448 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.909837 kubelet[2910]: E0128 01:15:36.909737 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.909837 kubelet[2910]: W0128 01:15:36.909751 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.909837 kubelet[2910]: E0128 01:15:36.909762 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.909965 kubelet[2910]: E0128 01:15:36.909958 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.910001 kubelet[2910]: W0128 01:15:36.909996 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.910046 kubelet[2910]: E0128 01:15:36.910040 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.910213 kubelet[2910]: E0128 01:15:36.910207 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.910330 kubelet[2910]: W0128 01:15:36.910251 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.910330 kubelet[2910]: E0128 01:15:36.910260 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.910412 kubelet[2910]: E0128 01:15:36.910407 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.910445 kubelet[2910]: W0128 01:15:36.910441 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.910477 kubelet[2910]: E0128 01:15:36.910472 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.910642 kubelet[2910]: E0128 01:15:36.910634 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.910680 kubelet[2910]: W0128 01:15:36.910675 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.910837 kubelet[2910]: E0128 01:15:36.910756 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.910944 kubelet[2910]: E0128 01:15:36.910938 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.910979 kubelet[2910]: W0128 01:15:36.910973 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.911028 kubelet[2910]: E0128 01:15:36.911022 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.911219 kubelet[2910]: E0128 01:15:36.911209 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.911246 kubelet[2910]: W0128 01:15:36.911219 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.911246 kubelet[2910]: E0128 01:15:36.911228 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.911368 kubelet[2910]: E0128 01:15:36.911359 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.911400 kubelet[2910]: W0128 01:15:36.911368 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.911400 kubelet[2910]: E0128 01:15:36.911374 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.911514 kubelet[2910]: E0128 01:15:36.911506 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.911514 kubelet[2910]: W0128 01:15:36.911513 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.911557 kubelet[2910]: E0128 01:15:36.911519 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:36.911706 kubelet[2910]: E0128 01:15:36.911697 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.911732 kubelet[2910]: W0128 01:15:36.911706 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.911732 kubelet[2910]: E0128 01:15:36.911712 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:36.911901 kubelet[2910]: E0128 01:15:36.911892 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:36.911901 kubelet[2910]: W0128 01:15:36.911900 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:36.911946 kubelet[2910]: E0128 01:15:36.911906 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.860958 kubelet[2910]: I0128 01:15:37.860927 2910 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 01:15:37.904205 kubelet[2910]: E0128 01:15:37.904163 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.904205 kubelet[2910]: W0128 01:15:37.904187 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.904205 kubelet[2910]: E0128 01:15:37.904208 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.904716 kubelet[2910]: E0128 01:15:37.904365 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.904716 kubelet[2910]: W0128 01:15:37.904371 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.904716 kubelet[2910]: E0128 01:15:37.904378 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.904716 kubelet[2910]: E0128 01:15:37.904526 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.904716 kubelet[2910]: W0128 01:15:37.904532 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.904716 kubelet[2910]: E0128 01:15:37.904539 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.904881 kubelet[2910]: E0128 01:15:37.904745 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.904881 kubelet[2910]: W0128 01:15:37.904751 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.904881 kubelet[2910]: E0128 01:15:37.904758 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.904881 kubelet[2910]: E0128 01:15:37.904872 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.904881 kubelet[2910]: W0128 01:15:37.904877 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.904881 kubelet[2910]: E0128 01:15:37.904883 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.905042 kubelet[2910]: E0128 01:15:37.904989 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.905042 kubelet[2910]: W0128 01:15:37.904994 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.905042 kubelet[2910]: E0128 01:15:37.905000 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.905466 kubelet[2910]: E0128 01:15:37.905125 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.905466 kubelet[2910]: W0128 01:15:37.905134 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.905466 kubelet[2910]: E0128 01:15:37.905141 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.905466 kubelet[2910]: E0128 01:15:37.905244 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.905466 kubelet[2910]: W0128 01:15:37.905249 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.905466 kubelet[2910]: E0128 01:15:37.905255 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.905466 kubelet[2910]: E0128 01:15:37.905381 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.905466 kubelet[2910]: W0128 01:15:37.905395 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.905466 kubelet[2910]: E0128 01:15:37.905401 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.906098 kubelet[2910]: E0128 01:15:37.905523 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.906098 kubelet[2910]: W0128 01:15:37.905529 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.906098 kubelet[2910]: E0128 01:15:37.905535 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.906098 kubelet[2910]: E0128 01:15:37.905687 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.906098 kubelet[2910]: W0128 01:15:37.905693 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.906098 kubelet[2910]: E0128 01:15:37.905699 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.906098 kubelet[2910]: E0128 01:15:37.905838 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.906098 kubelet[2910]: W0128 01:15:37.905847 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.906098 kubelet[2910]: E0128 01:15:37.905852 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.906098 kubelet[2910]: E0128 01:15:37.905990 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.906507 kubelet[2910]: W0128 01:15:37.905995 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.906507 kubelet[2910]: E0128 01:15:37.906001 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.906507 kubelet[2910]: E0128 01:15:37.906138 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.906507 kubelet[2910]: W0128 01:15:37.906144 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.906507 kubelet[2910]: E0128 01:15:37.906150 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.906507 kubelet[2910]: E0128 01:15:37.906256 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.906507 kubelet[2910]: W0128 01:15:37.906261 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.906507 kubelet[2910]: E0128 01:15:37.906267 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.915949 kubelet[2910]: E0128 01:15:37.915856 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.915949 kubelet[2910]: W0128 01:15:37.915900 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.915949 kubelet[2910]: E0128 01:15:37.915923 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.916456 kubelet[2910]: E0128 01:15:37.916315 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.916456 kubelet[2910]: W0128 01:15:37.916324 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.916456 kubelet[2910]: E0128 01:15:37.916334 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.916597 kubelet[2910]: E0128 01:15:37.916589 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.916645 kubelet[2910]: W0128 01:15:37.916639 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.916692 kubelet[2910]: E0128 01:15:37.916686 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.916962 kubelet[2910]: E0128 01:15:37.916945 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.917021 kubelet[2910]: W0128 01:15:37.916963 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.917021 kubelet[2910]: E0128 01:15:37.916976 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.917179 kubelet[2910]: E0128 01:15:37.917169 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.917205 kubelet[2910]: W0128 01:15:37.917178 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.917205 kubelet[2910]: E0128 01:15:37.917187 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.917338 kubelet[2910]: E0128 01:15:37.917329 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.917366 kubelet[2910]: W0128 01:15:37.917340 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.917366 kubelet[2910]: E0128 01:15:37.917347 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.917526 kubelet[2910]: E0128 01:15:37.917518 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.917526 kubelet[2910]: W0128 01:15:37.917526 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.917599 kubelet[2910]: E0128 01:15:37.917533 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.918366 kubelet[2910]: E0128 01:15:37.918351 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.918366 kubelet[2910]: W0128 01:15:37.918365 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.918454 kubelet[2910]: E0128 01:15:37.918376 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.918541 kubelet[2910]: E0128 01:15:37.918532 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.918571 kubelet[2910]: W0128 01:15:37.918540 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.918571 kubelet[2910]: E0128 01:15:37.918546 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.918673 kubelet[2910]: E0128 01:15:37.918663 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.918703 kubelet[2910]: W0128 01:15:37.918684 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.918703 kubelet[2910]: E0128 01:15:37.918693 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.918903 kubelet[2910]: E0128 01:15:37.918893 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.918936 kubelet[2910]: W0128 01:15:37.918903 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.918936 kubelet[2910]: E0128 01:15:37.918910 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.919079 kubelet[2910]: E0128 01:15:37.919069 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.919079 kubelet[2910]: W0128 01:15:37.919079 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.919128 kubelet[2910]: E0128 01:15:37.919094 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.919274 kubelet[2910]: E0128 01:15:37.919266 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.919274 kubelet[2910]: W0128 01:15:37.919274 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.919326 kubelet[2910]: E0128 01:15:37.919280 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.919707 kubelet[2910]: E0128 01:15:37.919608 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.919707 kubelet[2910]: W0128 01:15:37.919619 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.919707 kubelet[2910]: E0128 01:15:37.919630 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.919881 kubelet[2910]: E0128 01:15:37.919873 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.919926 kubelet[2910]: W0128 01:15:37.919919 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.919961 kubelet[2910]: E0128 01:15:37.919955 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.920293 kubelet[2910]: E0128 01:15:37.920150 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.920293 kubelet[2910]: W0128 01:15:37.920157 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.920293 kubelet[2910]: E0128 01:15:37.920164 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:37.920394 kubelet[2910]: E0128 01:15:37.920351 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.920394 kubelet[2910]: W0128 01:15:37.920359 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.920394 kubelet[2910]: E0128 01:15:37.920367 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:15:37.920572 kubelet[2910]: E0128 01:15:37.920562 2910 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:15:37.920572 kubelet[2910]: W0128 01:15:37.920571 2910 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:15:37.920618 kubelet[2910]: E0128 01:15:37.920579 2910 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:15:38.233594 containerd[1674]: time="2026-01-28T01:15:38.233446006Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:38.236217 containerd[1674]: time="2026-01-28T01:15:38.235952876Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4442579" Jan 28 01:15:38.237870 containerd[1674]: time="2026-01-28T01:15:38.237839189Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:38.240234 containerd[1674]: time="2026-01-28T01:15:38.240199208Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:38.240675 containerd[1674]: time="2026-01-28T01:15:38.240655040Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.98516854s" Jan 28 01:15:38.240746 containerd[1674]: time="2026-01-28T01:15:38.240735507Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 28 01:15:38.246262 containerd[1674]: time="2026-01-28T01:15:38.246232409Z" level=info msg="CreateContainer within sandbox \"f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 28 01:15:38.261024 containerd[1674]: time="2026-01-28T01:15:38.259957558Z" level=info msg="Container b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:38.269805 containerd[1674]: time="2026-01-28T01:15:38.269772408Z" level=info msg="CreateContainer within sandbox \"f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c\"" Jan 28 01:15:38.271105 containerd[1674]: time="2026-01-28T01:15:38.271064420Z" level=info msg="StartContainer for \"b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c\"" Jan 28 01:15:38.273930 containerd[1674]: time="2026-01-28T01:15:38.273899133Z" level=info msg="connecting to shim b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c" address="unix:///run/containerd/s/477df8dcf35e8fc7bec38ae7b920240307c8f275fc87a791b9c4fb993462fded" protocol=ttrpc version=3 Jan 28 01:15:38.292436 systemd[1]: Started cri-containerd-b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c.scope - libcontainer container b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c. 
Jan 28 01:15:38.348000 audit: BPF prog-id=166 op=LOAD Jan 28 01:15:38.350384 kernel: kauditd_printk_skb: 68 callbacks suppressed Jan 28 01:15:38.350453 kernel: audit: type=1334 audit(1769562938.348:563): prog-id=166 op=LOAD Jan 28 01:15:38.348000 audit[3603]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3449 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:38.354084 kernel: audit: type=1300 audit(1769562938.348:563): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3449 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:38.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623865303737653935393036396335623631343136396430613431 Jan 28 01:15:38.358939 kernel: audit: type=1327 audit(1769562938.348:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623865303737653935393036396335623631343136396430613431 Jan 28 01:15:38.362399 kernel: audit: type=1334 audit(1769562938.348:564): prog-id=167 op=LOAD Jan 28 01:15:38.348000 audit: BPF prog-id=167 op=LOAD Jan 28 01:15:38.348000 audit[3603]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3449 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:38.365214 kernel: audit: type=1300 audit(1769562938.348:564): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3449 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:38.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623865303737653935393036396335623631343136396430613431 Jan 28 01:15:38.369693 kernel: audit: type=1327 audit(1769562938.348:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623865303737653935393036396335623631343136396430613431 Jan 28 01:15:38.348000 audit: BPF prog-id=167 op=UNLOAD Jan 28 01:15:38.372626 kernel: audit: type=1334 audit(1769562938.348:565): prog-id=167 op=UNLOAD Jan 28 01:15:38.348000 audit[3603]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:38.375202 kernel: audit: type=1300 audit(1769562938.348:565): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:38.348000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623865303737653935393036396335623631343136396430613431 Jan 28 01:15:38.348000 audit: BPF prog-id=166 op=UNLOAD Jan 28 01:15:38.387346 kernel: audit: type=1327 audit(1769562938.348:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623865303737653935393036396335623631343136396430613431 Jan 28 01:15:38.387399 kernel: audit: type=1334 audit(1769562938.348:566): prog-id=166 op=UNLOAD Jan 28 01:15:38.348000 audit[3603]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:38.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623865303737653935393036396335623631343136396430613431 Jan 28 01:15:38.348000 audit: BPF prog-id=168 op=LOAD Jan 28 01:15:38.348000 audit[3603]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3449 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:38.348000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623865303737653935393036396335623631343136396430613431 Jan 28 01:15:38.395889 containerd[1674]: time="2026-01-28T01:15:38.395585585Z" level=info msg="StartContainer for \"b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c\" returns successfully" Jan 28 01:15:38.409945 systemd[1]: cri-containerd-b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c.scope: Deactivated successfully. Jan 28 01:15:38.412000 audit: BPF prog-id=168 op=UNLOAD Jan 28 01:15:38.416364 containerd[1674]: time="2026-01-28T01:15:38.416265759Z" level=info msg="received container exit event container_id:\"b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c\" id:\"b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c\" pid:3617 exited_at:{seconds:1769562938 nanos:415185587}" Jan 28 01:15:38.437250 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c-rootfs.mount: Deactivated successfully. 
Jan 28 01:15:38.747299 kubelet[2910]: E0128 01:15:38.747248 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:15:38.865293 containerd[1674]: time="2026-01-28T01:15:38.865241208Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 28 01:15:40.748728 kubelet[2910]: E0128 01:15:40.747672 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:15:42.676369 containerd[1674]: time="2026-01-28T01:15:42.676206065Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:42.678719 containerd[1674]: time="2026-01-28T01:15:42.678534044Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 28 01:15:42.680364 containerd[1674]: time="2026-01-28T01:15:42.680335609Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:42.684032 containerd[1674]: time="2026-01-28T01:15:42.683972030Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:42.684782 containerd[1674]: time="2026-01-28T01:15:42.684388226Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" 
with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.819116225s" Jan 28 01:15:42.684782 containerd[1674]: time="2026-01-28T01:15:42.684418374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 28 01:15:42.691152 containerd[1674]: time="2026-01-28T01:15:42.691091699Z" level=info msg="CreateContainer within sandbox \"f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 28 01:15:42.706598 containerd[1674]: time="2026-01-28T01:15:42.705891372Z" level=info msg="Container d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:42.725627 containerd[1674]: time="2026-01-28T01:15:42.725588164Z" level=info msg="CreateContainer within sandbox \"f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b\"" Jan 28 01:15:42.726046 containerd[1674]: time="2026-01-28T01:15:42.726025808Z" level=info msg="StartContainer for \"d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b\"" Jan 28 01:15:42.727469 containerd[1674]: time="2026-01-28T01:15:42.727442640Z" level=info msg="connecting to shim d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b" address="unix:///run/containerd/s/477df8dcf35e8fc7bec38ae7b920240307c8f275fc87a791b9c4fb993462fded" protocol=ttrpc version=3 Jan 28 01:15:42.747976 kubelet[2910]: E0128 01:15:42.747158 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:15:42.769208 systemd[1]: Started cri-containerd-d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b.scope - libcontainer container d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b. Jan 28 01:15:42.808000 audit: BPF prog-id=169 op=LOAD Jan 28 01:15:42.808000 audit[3663]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3449 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:42.808000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437636634316166386239396533366635616238383531353435633663 Jan 28 01:15:42.808000 audit: BPF prog-id=170 op=LOAD Jan 28 01:15:42.808000 audit[3663]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3449 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:42.808000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437636634316166386239396533366635616238383531353435633663 Jan 28 01:15:42.808000 audit: BPF prog-id=170 op=UNLOAD Jan 28 01:15:42.808000 audit[3663]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3663 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:42.808000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437636634316166386239396533366635616238383531353435633663 Jan 28 01:15:42.808000 audit: BPF prog-id=169 op=UNLOAD Jan 28 01:15:42.808000 audit[3663]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:42.808000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437636634316166386239396533366635616238383531353435633663 Jan 28 01:15:42.808000 audit: BPF prog-id=171 op=LOAD Jan 28 01:15:42.808000 audit[3663]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3449 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:42.808000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437636634316166386239396533366635616238383531353435633663 Jan 28 01:15:42.829424 containerd[1674]: time="2026-01-28T01:15:42.829358528Z" level=info msg="StartContainer for 
\"d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b\" returns successfully" Jan 28 01:15:43.322349 containerd[1674]: time="2026-01-28T01:15:43.322146162Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 28 01:15:43.324183 systemd[1]: cri-containerd-d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b.scope: Deactivated successfully. Jan 28 01:15:43.324740 systemd[1]: cri-containerd-d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b.scope: Consumed 450ms CPU time, 197.6M memory peak, 171.3M written to disk. Jan 28 01:15:43.326183 containerd[1674]: time="2026-01-28T01:15:43.326132881Z" level=info msg="received container exit event container_id:\"d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b\" id:\"d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b\" pid:3676 exited_at:{seconds:1769562943 nanos:325919615}" Jan 28 01:15:43.328000 audit: BPF prog-id=171 op=UNLOAD Jan 28 01:15:43.348322 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d7cf41af8b99e36f5ab8851545c6c8741a6d23af8aa9bed9efbaf24f427f045b-rootfs.mount: Deactivated successfully. Jan 28 01:15:43.373569 kubelet[2910]: I0128 01:15:43.373546 2910 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 28 01:15:43.422208 systemd[1]: Created slice kubepods-burstable-pod999bf8e1_3bd4_4ef0_8db0_79618ac53f0b.slice - libcontainer container kubepods-burstable-pod999bf8e1_3bd4_4ef0_8db0_79618ac53f0b.slice. Jan 28 01:15:43.433312 systemd[1]: Created slice kubepods-burstable-podaab2583c_2dbb_4842_965d_f4f1d01197b0.slice - libcontainer container kubepods-burstable-podaab2583c_2dbb_4842_965d_f4f1d01197b0.slice. 
Jan 28 01:15:43.440966 systemd[1]: Created slice kubepods-besteffort-pod72726466_f235_4a31_a84a_a3699d8c85f7.slice - libcontainer container kubepods-besteffort-pod72726466_f235_4a31_a84a_a3699d8c85f7.slice. Jan 28 01:15:43.448263 systemd[1]: Created slice kubepods-besteffort-pod7ea90b44_fc7d_4702_a1a5_1c558b3ecd80.slice - libcontainer container kubepods-besteffort-pod7ea90b44_fc7d_4702_a1a5_1c558b3ecd80.slice. Jan 28 01:15:43.454590 systemd[1]: Created slice kubepods-besteffort-pod444bc361_893d_40c2_bec0_a4908317d6e3.slice - libcontainer container kubepods-besteffort-pod444bc361_893d_40c2_bec0_a4908317d6e3.slice. Jan 28 01:15:43.455596 kubelet[2910]: I0128 01:15:43.455574 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fx89\" (UniqueName: \"kubernetes.io/projected/bead6395-8434-48df-aa67-e987782da70c-kube-api-access-7fx89\") pod \"calico-apiserver-76f584f9b9-6dh2c\" (UID: \"bead6395-8434-48df-aa67-e987782da70c\") " pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" Jan 28 01:15:43.455675 kubelet[2910]: I0128 01:15:43.455606 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2khj\" (UniqueName: \"kubernetes.io/projected/999bf8e1-3bd4-4ef0-8db0-79618ac53f0b-kube-api-access-s2khj\") pod \"coredns-674b8bbfcf-24fxv\" (UID: \"999bf8e1-3bd4-4ef0-8db0-79618ac53f0b\") " pod="kube-system/coredns-674b8bbfcf-24fxv" Jan 28 01:15:43.455675 kubelet[2910]: I0128 01:15:43.455622 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnlh\" (UniqueName: \"kubernetes.io/projected/72726466-f235-4a31-a84a-a3699d8c85f7-kube-api-access-8nnlh\") pod \"calico-kube-controllers-79f9cd9ddf-ggx6z\" (UID: \"72726466-f235-4a31-a84a-a3699d8c85f7\") " pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" Jan 28 01:15:43.455675 kubelet[2910]: I0128 01:15:43.455640 2910 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c22de3ae-0a27-443f-9dd3-c4ab0a4176bd-goldmane-ca-bundle\") pod \"goldmane-666569f655-cblpt\" (UID: \"c22de3ae-0a27-443f-9dd3-c4ab0a4176bd\") " pod="calico-system/goldmane-666569f655-cblpt" Jan 28 01:15:43.455675 kubelet[2910]: I0128 01:15:43.455657 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgdzc\" (UniqueName: \"kubernetes.io/projected/7ea90b44-fc7d-4702-a1a5-1c558b3ecd80-kube-api-access-zgdzc\") pod \"calico-apiserver-76f584f9b9-9mjk9\" (UID: \"7ea90b44-fc7d-4702-a1a5-1c558b3ecd80\") " pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" Jan 28 01:15:43.455675 kubelet[2910]: I0128 01:15:43.455674 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bead6395-8434-48df-aa67-e987782da70c-calico-apiserver-certs\") pod \"calico-apiserver-76f584f9b9-6dh2c\" (UID: \"bead6395-8434-48df-aa67-e987782da70c\") " pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" Jan 28 01:15:43.455789 kubelet[2910]: I0128 01:15:43.455690 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c22de3ae-0a27-443f-9dd3-c4ab0a4176bd-goldmane-key-pair\") pod \"goldmane-666569f655-cblpt\" (UID: \"c22de3ae-0a27-443f-9dd3-c4ab0a4176bd\") " pod="calico-system/goldmane-666569f655-cblpt" Jan 28 01:15:43.455789 kubelet[2910]: I0128 01:15:43.455705 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/999bf8e1-3bd4-4ef0-8db0-79618ac53f0b-config-volume\") pod \"coredns-674b8bbfcf-24fxv\" (UID: \"999bf8e1-3bd4-4ef0-8db0-79618ac53f0b\") " 
pod="kube-system/coredns-674b8bbfcf-24fxv" Jan 28 01:15:43.455789 kubelet[2910]: I0128 01:15:43.455718 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm2jg\" (UniqueName: \"kubernetes.io/projected/aab2583c-2dbb-4842-965d-f4f1d01197b0-kube-api-access-fm2jg\") pod \"coredns-674b8bbfcf-cxbk5\" (UID: \"aab2583c-2dbb-4842-965d-f4f1d01197b0\") " pod="kube-system/coredns-674b8bbfcf-cxbk5" Jan 28 01:15:43.455789 kubelet[2910]: I0128 01:15:43.455731 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c22de3ae-0a27-443f-9dd3-c4ab0a4176bd-config\") pod \"goldmane-666569f655-cblpt\" (UID: \"c22de3ae-0a27-443f-9dd3-c4ab0a4176bd\") " pod="calico-system/goldmane-666569f655-cblpt" Jan 28 01:15:43.455789 kubelet[2910]: I0128 01:15:43.455750 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7ea90b44-fc7d-4702-a1a5-1c558b3ecd80-calico-apiserver-certs\") pod \"calico-apiserver-76f584f9b9-9mjk9\" (UID: \"7ea90b44-fc7d-4702-a1a5-1c558b3ecd80\") " pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" Jan 28 01:15:43.455893 kubelet[2910]: I0128 01:15:43.455763 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aab2583c-2dbb-4842-965d-f4f1d01197b0-config-volume\") pod \"coredns-674b8bbfcf-cxbk5\" (UID: \"aab2583c-2dbb-4842-965d-f4f1d01197b0\") " pod="kube-system/coredns-674b8bbfcf-cxbk5" Jan 28 01:15:43.455893 kubelet[2910]: I0128 01:15:43.455778 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72726466-f235-4a31-a84a-a3699d8c85f7-tigera-ca-bundle\") pod 
\"calico-kube-controllers-79f9cd9ddf-ggx6z\" (UID: \"72726466-f235-4a31-a84a-a3699d8c85f7\") " pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" Jan 28 01:15:43.455893 kubelet[2910]: I0128 01:15:43.455794 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/444bc361-893d-40c2-bec0-a4908317d6e3-whisker-backend-key-pair\") pod \"whisker-f469ddd4c-pvnqd\" (UID: \"444bc361-893d-40c2-bec0-a4908317d6e3\") " pod="calico-system/whisker-f469ddd4c-pvnqd" Jan 28 01:15:43.455893 kubelet[2910]: I0128 01:15:43.455807 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/444bc361-893d-40c2-bec0-a4908317d6e3-whisker-ca-bundle\") pod \"whisker-f469ddd4c-pvnqd\" (UID: \"444bc361-893d-40c2-bec0-a4908317d6e3\") " pod="calico-system/whisker-f469ddd4c-pvnqd" Jan 28 01:15:43.455893 kubelet[2910]: I0128 01:15:43.455820 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jmx\" (UniqueName: \"kubernetes.io/projected/444bc361-893d-40c2-bec0-a4908317d6e3-kube-api-access-d6jmx\") pod \"whisker-f469ddd4c-pvnqd\" (UID: \"444bc361-893d-40c2-bec0-a4908317d6e3\") " pod="calico-system/whisker-f469ddd4c-pvnqd" Jan 28 01:15:43.455997 kubelet[2910]: I0128 01:15:43.455835 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqql6\" (UniqueName: \"kubernetes.io/projected/c22de3ae-0a27-443f-9dd3-c4ab0a4176bd-kube-api-access-tqql6\") pod \"goldmane-666569f655-cblpt\" (UID: \"c22de3ae-0a27-443f-9dd3-c4ab0a4176bd\") " pod="calico-system/goldmane-666569f655-cblpt" Jan 28 01:15:43.459628 systemd[1]: Created slice kubepods-besteffort-podc22de3ae_0a27_443f_9dd3_c4ab0a4176bd.slice - libcontainer container 
kubepods-besteffort-podc22de3ae_0a27_443f_9dd3_c4ab0a4176bd.slice. Jan 28 01:15:43.464367 systemd[1]: Created slice kubepods-besteffort-podbead6395_8434_48df_aa67_e987782da70c.slice - libcontainer container kubepods-besteffort-podbead6395_8434_48df_aa67_e987782da70c.slice. Jan 28 01:15:43.730922 containerd[1674]: time="2026-01-28T01:15:43.730843493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-24fxv,Uid:999bf8e1-3bd4-4ef0-8db0-79618ac53f0b,Namespace:kube-system,Attempt:0,}" Jan 28 01:15:43.737318 containerd[1674]: time="2026-01-28T01:15:43.737293145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cxbk5,Uid:aab2583c-2dbb-4842-965d-f4f1d01197b0,Namespace:kube-system,Attempt:0,}" Jan 28 01:15:43.757960 containerd[1674]: time="2026-01-28T01:15:43.757885168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f469ddd4c-pvnqd,Uid:444bc361-893d-40c2-bec0-a4908317d6e3,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:43.758891 containerd[1674]: time="2026-01-28T01:15:43.758832880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79f9cd9ddf-ggx6z,Uid:72726466-f235-4a31-a84a-a3699d8c85f7,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:43.759187 containerd[1674]: time="2026-01-28T01:15:43.759161450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76f584f9b9-9mjk9,Uid:7ea90b44-fc7d-4702-a1a5-1c558b3ecd80,Namespace:calico-apiserver,Attempt:0,}" Jan 28 01:15:43.766895 containerd[1674]: time="2026-01-28T01:15:43.766778447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-cblpt,Uid:c22de3ae-0a27-443f-9dd3-c4ab0a4176bd,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:43.771593 containerd[1674]: time="2026-01-28T01:15:43.771137718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76f584f9b9-6dh2c,Uid:bead6395-8434-48df-aa67-e987782da70c,Namespace:calico-apiserver,Attempt:0,}" 
Jan 28 01:15:43.875808 containerd[1674]: time="2026-01-28T01:15:43.875707501Z" level=error msg="Failed to destroy network for sandbox \"4a9ea00f6934799e5ca865fc01df11e91dd324a531a2a8479b11a1a243ed38f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.883361 containerd[1674]: time="2026-01-28T01:15:43.883117861Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cxbk5,Uid:aab2583c-2dbb-4842-965d-f4f1d01197b0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a9ea00f6934799e5ca865fc01df11e91dd324a531a2a8479b11a1a243ed38f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.883750 containerd[1674]: time="2026-01-28T01:15:43.883730811Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 28 01:15:43.884084 kubelet[2910]: E0128 01:15:43.884047 2910 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a9ea00f6934799e5ca865fc01df11e91dd324a531a2a8479b11a1a243ed38f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.884834 kubelet[2910]: E0128 01:15:43.884094 2910 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a9ea00f6934799e5ca865fc01df11e91dd324a531a2a8479b11a1a243ed38f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-cxbk5" Jan 28 01:15:43.884834 kubelet[2910]: E0128 01:15:43.884112 2910 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a9ea00f6934799e5ca865fc01df11e91dd324a531a2a8479b11a1a243ed38f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-cxbk5" Jan 28 01:15:43.884834 kubelet[2910]: E0128 01:15:43.884147 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-cxbk5_kube-system(aab2583c-2dbb-4842-965d-f4f1d01197b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-cxbk5_kube-system(aab2583c-2dbb-4842-965d-f4f1d01197b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a9ea00f6934799e5ca865fc01df11e91dd324a531a2a8479b11a1a243ed38f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-cxbk5" podUID="aab2583c-2dbb-4842-965d-f4f1d01197b0" Jan 28 01:15:43.920211 containerd[1674]: time="2026-01-28T01:15:43.920110722Z" level=error msg="Failed to destroy network for sandbox \"89e6355fc848bb96b42d86dd12aa16a71ab3a93c006e96f6cdc7e428ceb01daf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.929053 containerd[1674]: time="2026-01-28T01:15:43.928791906Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-24fxv,Uid:999bf8e1-3bd4-4ef0-8db0-79618ac53f0b,Namespace:kube-system,Attempt:0,} failed, error" 
error="rpc error: code = Unknown desc = failed to setup network for sandbox \"89e6355fc848bb96b42d86dd12aa16a71ab3a93c006e96f6cdc7e428ceb01daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.929204 kubelet[2910]: E0128 01:15:43.929091 2910 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89e6355fc848bb96b42d86dd12aa16a71ab3a93c006e96f6cdc7e428ceb01daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.929204 kubelet[2910]: E0128 01:15:43.929161 2910 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89e6355fc848bb96b42d86dd12aa16a71ab3a93c006e96f6cdc7e428ceb01daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-24fxv" Jan 28 01:15:43.929583 kubelet[2910]: E0128 01:15:43.929273 2910 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89e6355fc848bb96b42d86dd12aa16a71ab3a93c006e96f6cdc7e428ceb01daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-24fxv" Jan 28 01:15:43.931033 kubelet[2910]: E0128 01:15:43.929704 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-24fxv_kube-system(999bf8e1-3bd4-4ef0-8db0-79618ac53f0b)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-24fxv_kube-system(999bf8e1-3bd4-4ef0-8db0-79618ac53f0b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"89e6355fc848bb96b42d86dd12aa16a71ab3a93c006e96f6cdc7e428ceb01daf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-24fxv" podUID="999bf8e1-3bd4-4ef0-8db0-79618ac53f0b" Jan 28 01:15:43.949625 containerd[1674]: time="2026-01-28T01:15:43.949466054Z" level=error msg="Failed to destroy network for sandbox \"00fac4e0a2e3f7c9f98e2561b37622d8e66d166fd78e3b2922b935965926a951\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.955853 containerd[1674]: time="2026-01-28T01:15:43.955799555Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79f9cd9ddf-ggx6z,Uid:72726466-f235-4a31-a84a-a3699d8c85f7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"00fac4e0a2e3f7c9f98e2561b37622d8e66d166fd78e3b2922b935965926a951\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.956203 containerd[1674]: time="2026-01-28T01:15:43.956122342Z" level=error msg="Failed to destroy network for sandbox \"b60e05676f529161a3f52839ca136c4cbca33ed5284b567e6dea635f674e577d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.956933 kubelet[2910]: E0128 01:15:43.956895 2910 
log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00fac4e0a2e3f7c9f98e2561b37622d8e66d166fd78e3b2922b935965926a951\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.957032 kubelet[2910]: E0128 01:15:43.956949 2910 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00fac4e0a2e3f7c9f98e2561b37622d8e66d166fd78e3b2922b935965926a951\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" Jan 28 01:15:43.957032 kubelet[2910]: E0128 01:15:43.956967 2910 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00fac4e0a2e3f7c9f98e2561b37622d8e66d166fd78e3b2922b935965926a951\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" Jan 28 01:15:43.957032 kubelet[2910]: E0128 01:15:43.957019 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-79f9cd9ddf-ggx6z_calico-system(72726466-f235-4a31-a84a-a3699d8c85f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-79f9cd9ddf-ggx6z_calico-system(72726466-f235-4a31-a84a-a3699d8c85f7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"00fac4e0a2e3f7c9f98e2561b37622d8e66d166fd78e3b2922b935965926a951\\\": plugin type=\\\"calico\\\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:15:43.959611 containerd[1674]: time="2026-01-28T01:15:43.959583924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f469ddd4c-pvnqd,Uid:444bc361-893d-40c2-bec0-a4908317d6e3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b60e05676f529161a3f52839ca136c4cbca33ed5284b567e6dea635f674e577d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.962755 kubelet[2910]: E0128 01:15:43.962727 2910 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b60e05676f529161a3f52839ca136c4cbca33ed5284b567e6dea635f674e577d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.962831 kubelet[2910]: E0128 01:15:43.962775 2910 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b60e05676f529161a3f52839ca136c4cbca33ed5284b567e6dea635f674e577d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-f469ddd4c-pvnqd" Jan 28 01:15:43.962831 kubelet[2910]: E0128 01:15:43.962793 2910 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b60e05676f529161a3f52839ca136c4cbca33ed5284b567e6dea635f674e577d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-f469ddd4c-pvnqd" Jan 28 01:15:43.962878 kubelet[2910]: E0128 01:15:43.962832 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-f469ddd4c-pvnqd_calico-system(444bc361-893d-40c2-bec0-a4908317d6e3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-f469ddd4c-pvnqd_calico-system(444bc361-893d-40c2-bec0-a4908317d6e3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b60e05676f529161a3f52839ca136c4cbca33ed5284b567e6dea635f674e577d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-f469ddd4c-pvnqd" podUID="444bc361-893d-40c2-bec0-a4908317d6e3" Jan 28 01:15:43.972293 containerd[1674]: time="2026-01-28T01:15:43.972245839Z" level=error msg="Failed to destroy network for sandbox \"ec149d5486e7a9eee9152f914ba3e3be4d9d86cdf199d74cbe7ad81daa0ddc07\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.973646 containerd[1674]: time="2026-01-28T01:15:43.973532876Z" level=error msg="Failed to destroy network for sandbox \"94609e279fd3ad24a23fd00d5cb87cf7232cd1dedce4119183f80452d16ad117\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.978756 containerd[1674]: time="2026-01-28T01:15:43.978710653Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-76f584f9b9-9mjk9,Uid:7ea90b44-fc7d-4702-a1a5-1c558b3ecd80,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"94609e279fd3ad24a23fd00d5cb87cf7232cd1dedce4119183f80452d16ad117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.978968 kubelet[2910]: E0128 01:15:43.978934 2910 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94609e279fd3ad24a23fd00d5cb87cf7232cd1dedce4119183f80452d16ad117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.979424 kubelet[2910]: E0128 01:15:43.978981 2910 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94609e279fd3ad24a23fd00d5cb87cf7232cd1dedce4119183f80452d16ad117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" Jan 28 01:15:43.979424 kubelet[2910]: E0128 01:15:43.979000 2910 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94609e279fd3ad24a23fd00d5cb87cf7232cd1dedce4119183f80452d16ad117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" Jan 28 01:15:43.979424 kubelet[2910]: E0128 01:15:43.979065 2910 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76f584f9b9-9mjk9_calico-apiserver(7ea90b44-fc7d-4702-a1a5-1c558b3ecd80)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76f584f9b9-9mjk9_calico-apiserver(7ea90b44-fc7d-4702-a1a5-1c558b3ecd80)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"94609e279fd3ad24a23fd00d5cb87cf7232cd1dedce4119183f80452d16ad117\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:15:43.980556 containerd[1674]: time="2026-01-28T01:15:43.980491707Z" level=error msg="Failed to destroy network for sandbox \"7961cb981d5d42cc1109fdb5aad5ecb53cf5aa5d67bd6c4dc3c5a05d8f2ea11a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.982811 containerd[1674]: time="2026-01-28T01:15:43.982671628Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76f584f9b9-6dh2c,Uid:bead6395-8434-48df-aa67-e987782da70c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec149d5486e7a9eee9152f914ba3e3be4d9d86cdf199d74cbe7ad81daa0ddc07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.983349 kubelet[2910]: E0128 01:15:43.983315 2910 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ec149d5486e7a9eee9152f914ba3e3be4d9d86cdf199d74cbe7ad81daa0ddc07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.983406 kubelet[2910]: E0128 01:15:43.983368 2910 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec149d5486e7a9eee9152f914ba3e3be4d9d86cdf199d74cbe7ad81daa0ddc07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" Jan 28 01:15:43.983406 kubelet[2910]: E0128 01:15:43.983386 2910 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec149d5486e7a9eee9152f914ba3e3be4d9d86cdf199d74cbe7ad81daa0ddc07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" Jan 28 01:15:43.983776 kubelet[2910]: E0128 01:15:43.983753 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76f584f9b9-6dh2c_calico-apiserver(bead6395-8434-48df-aa67-e987782da70c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76f584f9b9-6dh2c_calico-apiserver(bead6395-8434-48df-aa67-e987782da70c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ec149d5486e7a9eee9152f914ba3e3be4d9d86cdf199d74cbe7ad81daa0ddc07\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:15:43.985699 containerd[1674]: time="2026-01-28T01:15:43.985658077Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-cblpt,Uid:c22de3ae-0a27-443f-9dd3-c4ab0a4176bd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7961cb981d5d42cc1109fdb5aad5ecb53cf5aa5d67bd6c4dc3c5a05d8f2ea11a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.985925 kubelet[2910]: E0128 01:15:43.985900 2910 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7961cb981d5d42cc1109fdb5aad5ecb53cf5aa5d67bd6c4dc3c5a05d8f2ea11a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:43.985985 kubelet[2910]: E0128 01:15:43.985946 2910 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7961cb981d5d42cc1109fdb5aad5ecb53cf5aa5d67bd6c4dc3c5a05d8f2ea11a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-cblpt" Jan 28 01:15:43.985985 kubelet[2910]: E0128 01:15:43.985963 2910 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7961cb981d5d42cc1109fdb5aad5ecb53cf5aa5d67bd6c4dc3c5a05d8f2ea11a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-cblpt" Jan 28 01:15:43.986090 kubelet[2910]: E0128 01:15:43.986024 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-cblpt_calico-system(c22de3ae-0a27-443f-9dd3-c4ab0a4176bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-cblpt_calico-system(c22de3ae-0a27-443f-9dd3-c4ab0a4176bd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7961cb981d5d42cc1109fdb5aad5ecb53cf5aa5d67bd6c4dc3c5a05d8f2ea11a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:15:44.704953 systemd[1]: run-netns-cni\x2d3f94b657\x2df098\x2d058a\x2d6180\x2dad93518df4e2.mount: Deactivated successfully. Jan 28 01:15:44.705348 systemd[1]: run-netns-cni\x2d758e8589\x2dcad8\x2de88c\x2d248a\x2d17b5d93ad78c.mount: Deactivated successfully. Jan 28 01:15:44.705461 systemd[1]: run-netns-cni\x2d80027241\x2ddee7\x2d0b8a\x2d23a6\x2d6804fee14d7f.mount: Deactivated successfully. Jan 28 01:15:44.705560 systemd[1]: run-netns-cni\x2dcf690598\x2d9472\x2d7c3e\x2d8d13\x2dc72bb09e5886.mount: Deactivated successfully. Jan 28 01:15:44.757169 systemd[1]: Created slice kubepods-besteffort-pode60412d5_27c3_4569_9b64_5743c10cc437.slice - libcontainer container kubepods-besteffort-pode60412d5_27c3_4569_9b64_5743c10cc437.slice. 
Jan 28 01:15:44.760629 containerd[1674]: time="2026-01-28T01:15:44.760587366Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lkr4f,Uid:e60412d5-27c3-4569-9b64-5743c10cc437,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:44.822363 containerd[1674]: time="2026-01-28T01:15:44.822324337Z" level=error msg="Failed to destroy network for sandbox \"f7463a085a64b9c3bbe7efa2283066884ac22ff17f3ddb9a8b7ba4b35c9d25f7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:44.824586 systemd[1]: run-netns-cni\x2df1f27f4a\x2d79ef\x2de23e\x2d4ae4\x2dff8ecc4508e2.mount: Deactivated successfully. Jan 28 01:15:44.827129 containerd[1674]: time="2026-01-28T01:15:44.827037585Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lkr4f,Uid:e60412d5-27c3-4569-9b64-5743c10cc437,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f7463a085a64b9c3bbe7efa2283066884ac22ff17f3ddb9a8b7ba4b35c9d25f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:44.827268 kubelet[2910]: E0128 01:15:44.827236 2910 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f7463a085a64b9c3bbe7efa2283066884ac22ff17f3ddb9a8b7ba4b35c9d25f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:15:44.827322 kubelet[2910]: E0128 01:15:44.827289 2910 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f7463a085a64b9c3bbe7efa2283066884ac22ff17f3ddb9a8b7ba4b35c9d25f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lkr4f" Jan 28 01:15:44.827322 kubelet[2910]: E0128 01:15:44.827314 2910 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f7463a085a64b9c3bbe7efa2283066884ac22ff17f3ddb9a8b7ba4b35c9d25f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lkr4f" Jan 28 01:15:44.827389 kubelet[2910]: E0128 01:15:44.827359 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f7463a085a64b9c3bbe7efa2283066884ac22ff17f3ddb9a8b7ba4b35c9d25f7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:15:45.843849 kubelet[2910]: I0128 01:15:45.843288 2910 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 01:15:45.879000 audit[3929]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3929 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:45.882487 kernel: kauditd_printk_skb: 22 callbacks suppressed Jan 28 01:15:45.882558 kernel: 
audit: type=1325 audit(1769562945.879:575): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3929 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:45.879000 audit[3929]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff22084110 a2=0 a3=7fff220840fc items=0 ppid=3019 pid=3929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:45.894031 kernel: audit: type=1300 audit(1769562945.879:575): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff22084110 a2=0 a3=7fff220840fc items=0 ppid=3019 pid=3929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:45.879000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:45.898018 kernel: audit: type=1327 audit(1769562945.879:575): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:45.893000 audit[3929]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3929 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:45.901025 kernel: audit: type=1325 audit(1769562945.893:576): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3929 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:45.893000 audit[3929]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7fff22084110 a2=0 a3=7fff220840fc items=0 ppid=3019 pid=3929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 28 01:15:45.893000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:45.907829 kernel: audit: type=1300 audit(1769562945.893:576): arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7fff22084110 a2=0 a3=7fff220840fc items=0 ppid=3019 pid=3929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:45.907892 kernel: audit: type=1327 audit(1769562945.893:576): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:51.477999 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount471076799.mount: Deactivated successfully. Jan 28 01:15:51.503021 containerd[1674]: time="2026-01-28T01:15:51.502924034Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:51.508336 containerd[1674]: time="2026-01-28T01:15:51.508122724Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 28 01:15:51.508336 containerd[1674]: time="2026-01-28T01:15:51.508250951Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:51.511027 containerd[1674]: time="2026-01-28T01:15:51.510885213Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:15:51.511546 containerd[1674]: time="2026-01-28T01:15:51.511262385Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id 
\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.627451949s" Jan 28 01:15:51.511546 containerd[1674]: time="2026-01-28T01:15:51.511296801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 28 01:15:51.531785 containerd[1674]: time="2026-01-28T01:15:51.531738946Z" level=info msg="CreateContainer within sandbox \"f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 28 01:15:51.609072 containerd[1674]: time="2026-01-28T01:15:51.608285233Z" level=info msg="Container aaca43198013bcf6b7f624e32e4611d153117e68ea4ac4ff1c5e5762cdbe444b: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:51.611091 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount730010268.mount: Deactivated successfully. 
Jan 28 01:15:51.624051 containerd[1674]: time="2026-01-28T01:15:51.623981538Z" level=info msg="CreateContainer within sandbox \"f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"aaca43198013bcf6b7f624e32e4611d153117e68ea4ac4ff1c5e5762cdbe444b\"" Jan 28 01:15:51.626028 containerd[1674]: time="2026-01-28T01:15:51.624969561Z" level=info msg="StartContainer for \"aaca43198013bcf6b7f624e32e4611d153117e68ea4ac4ff1c5e5762cdbe444b\"" Jan 28 01:15:51.626624 containerd[1674]: time="2026-01-28T01:15:51.626594002Z" level=info msg="connecting to shim aaca43198013bcf6b7f624e32e4611d153117e68ea4ac4ff1c5e5762cdbe444b" address="unix:///run/containerd/s/477df8dcf35e8fc7bec38ae7b920240307c8f275fc87a791b9c4fb993462fded" protocol=ttrpc version=3 Jan 28 01:15:51.687245 systemd[1]: Started cri-containerd-aaca43198013bcf6b7f624e32e4611d153117e68ea4ac4ff1c5e5762cdbe444b.scope - libcontainer container aaca43198013bcf6b7f624e32e4611d153117e68ea4ac4ff1c5e5762cdbe444b. 
Jan 28 01:15:51.748000 audit: BPF prog-id=172 op=LOAD Jan 28 01:15:51.752048 kernel: audit: type=1334 audit(1769562951.748:577): prog-id=172 op=LOAD Jan 28 01:15:51.752119 kernel: audit: type=1300 audit(1769562951.748:577): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3449 pid=3939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:51.748000 audit[3939]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3449 pid=3939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:51.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161636134333139383031336263663662376636323465333265343631 Jan 28 01:15:51.758240 kernel: audit: type=1327 audit(1769562951.748:577): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161636134333139383031336263663662376636323465333265343631 Jan 28 01:15:51.748000 audit: BPF prog-id=173 op=LOAD Jan 28 01:15:51.761749 kernel: audit: type=1334 audit(1769562951.748:578): prog-id=173 op=LOAD Jan 28 01:15:51.765111 kernel: audit: type=1300 audit(1769562951.748:578): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3449 pid=3939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:15:51.748000 audit[3939]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3449 pid=3939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:51.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161636134333139383031336263663662376636323465333265343631 Jan 28 01:15:51.769532 kernel: audit: type=1327 audit(1769562951.748:578): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161636134333139383031336263663662376636323465333265343631 Jan 28 01:15:51.750000 audit: BPF prog-id=173 op=UNLOAD Jan 28 01:15:51.750000 audit[3939]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:51.775824 kernel: audit: type=1334 audit(1769562951.750:579): prog-id=173 op=UNLOAD Jan 28 01:15:51.775875 kernel: audit: type=1300 audit(1769562951.750:579): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:51.750000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161636134333139383031336263663662376636323465333265343631 Jan 28 01:15:51.780406 kernel: audit: type=1327 audit(1769562951.750:579): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161636134333139383031336263663662376636323465333265343631 Jan 28 01:15:51.750000 audit: BPF prog-id=172 op=UNLOAD Jan 28 01:15:51.783588 kernel: audit: type=1334 audit(1769562951.750:580): prog-id=172 op=UNLOAD Jan 28 01:15:51.750000 audit[3939]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3449 pid=3939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:51.750000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161636134333139383031336263663662376636323465333265343631 Jan 28 01:15:51.750000 audit: BPF prog-id=174 op=LOAD Jan 28 01:15:51.750000 audit[3939]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3449 pid=3939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:51.750000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161636134333139383031336263663662376636323465333265343631 Jan 28 01:15:51.799064 containerd[1674]: time="2026-01-28T01:15:51.799000400Z" level=info msg="StartContainer for \"aaca43198013bcf6b7f624e32e4611d153117e68ea4ac4ff1c5e5762cdbe444b\" returns successfully" Jan 28 01:15:51.892533 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 28 01:15:51.892646 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 28 01:15:51.949031 kubelet[2910]: I0128 01:15:51.948678 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-m9j58" podStartSLOduration=1.201061009 podStartE2EDuration="18.948663555s" podCreationTimestamp="2026-01-28 01:15:33 +0000 UTC" firstStartedPulling="2026-01-28 01:15:33.7642807 +0000 UTC m=+21.120132146" lastFinishedPulling="2026-01-28 01:15:51.511883247 +0000 UTC m=+38.867734692" observedRunningTime="2026-01-28 01:15:51.947462858 +0000 UTC m=+39.303314324" watchObservedRunningTime="2026-01-28 01:15:51.948663555 +0000 UTC m=+39.304515020" Jan 28 01:15:52.119162 kubelet[2910]: I0128 01:15:52.119115 2910 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/444bc361-893d-40c2-bec0-a4908317d6e3-whisker-ca-bundle\") pod \"444bc361-893d-40c2-bec0-a4908317d6e3\" (UID: \"444bc361-893d-40c2-bec0-a4908317d6e3\") " Jan 28 01:15:52.120170 kubelet[2910]: I0128 01:15:52.120103 2910 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6jmx\" (UniqueName: \"kubernetes.io/projected/444bc361-893d-40c2-bec0-a4908317d6e3-kube-api-access-d6jmx\") pod \"444bc361-893d-40c2-bec0-a4908317d6e3\" (UID: 
\"444bc361-893d-40c2-bec0-a4908317d6e3\") " Jan 28 01:15:52.120170 kubelet[2910]: I0128 01:15:52.120137 2910 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/444bc361-893d-40c2-bec0-a4908317d6e3-whisker-backend-key-pair\") pod \"444bc361-893d-40c2-bec0-a4908317d6e3\" (UID: \"444bc361-893d-40c2-bec0-a4908317d6e3\") " Jan 28 01:15:52.120483 kubelet[2910]: I0128 01:15:52.120469 2910 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444bc361-893d-40c2-bec0-a4908317d6e3-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "444bc361-893d-40c2-bec0-a4908317d6e3" (UID: "444bc361-893d-40c2-bec0-a4908317d6e3"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 28 01:15:52.127202 kubelet[2910]: I0128 01:15:52.127174 2910 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444bc361-893d-40c2-bec0-a4908317d6e3-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "444bc361-893d-40c2-bec0-a4908317d6e3" (UID: "444bc361-893d-40c2-bec0-a4908317d6e3"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 28 01:15:52.128046 kubelet[2910]: I0128 01:15:52.127995 2910 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444bc361-893d-40c2-bec0-a4908317d6e3-kube-api-access-d6jmx" (OuterVolumeSpecName: "kube-api-access-d6jmx") pod "444bc361-893d-40c2-bec0-a4908317d6e3" (UID: "444bc361-893d-40c2-bec0-a4908317d6e3"). InnerVolumeSpecName "kube-api-access-d6jmx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 28 01:15:52.221734 kubelet[2910]: I0128 01:15:52.221595 2910 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/444bc361-893d-40c2-bec0-a4908317d6e3-whisker-ca-bundle\") on node \"ci-4593-0-0-n-62761e1650\" DevicePath \"\"" Jan 28 01:15:52.221734 kubelet[2910]: I0128 01:15:52.221621 2910 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d6jmx\" (UniqueName: \"kubernetes.io/projected/444bc361-893d-40c2-bec0-a4908317d6e3-kube-api-access-d6jmx\") on node \"ci-4593-0-0-n-62761e1650\" DevicePath \"\"" Jan 28 01:15:52.221734 kubelet[2910]: I0128 01:15:52.221631 2910 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/444bc361-893d-40c2-bec0-a4908317d6e3-whisker-backend-key-pair\") on node \"ci-4593-0-0-n-62761e1650\" DevicePath \"\"" Jan 28 01:15:52.481432 systemd[1]: var-lib-kubelet-pods-444bc361\x2d893d\x2d40c2\x2dbec0\x2da4908317d6e3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dd6jmx.mount: Deactivated successfully. Jan 28 01:15:52.481514 systemd[1]: var-lib-kubelet-pods-444bc361\x2d893d\x2d40c2\x2dbec0\x2da4908317d6e3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 28 01:15:52.752785 systemd[1]: Removed slice kubepods-besteffort-pod444bc361_893d_40c2_bec0_a4908317d6e3.slice - libcontainer container kubepods-besteffort-pod444bc361_893d_40c2_bec0_a4908317d6e3.slice. Jan 28 01:15:53.004353 systemd[1]: Created slice kubepods-besteffort-pod3934179d_fb13_45a1_9643_cbd7ec08e773.slice - libcontainer container kubepods-besteffort-pod3934179d_fb13_45a1_9643_cbd7ec08e773.slice. 
Jan 28 01:15:53.026613 kubelet[2910]: I0128 01:15:53.026567 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3934179d-fb13-45a1-9643-cbd7ec08e773-whisker-ca-bundle\") pod \"whisker-b767f6fff-w2bfk\" (UID: \"3934179d-fb13-45a1-9643-cbd7ec08e773\") " pod="calico-system/whisker-b767f6fff-w2bfk" Jan 28 01:15:53.026613 kubelet[2910]: I0128 01:15:53.026613 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3934179d-fb13-45a1-9643-cbd7ec08e773-whisker-backend-key-pair\") pod \"whisker-b767f6fff-w2bfk\" (UID: \"3934179d-fb13-45a1-9643-cbd7ec08e773\") " pod="calico-system/whisker-b767f6fff-w2bfk" Jan 28 01:15:53.026613 kubelet[2910]: I0128 01:15:53.026630 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8tc\" (UniqueName: \"kubernetes.io/projected/3934179d-fb13-45a1-9643-cbd7ec08e773-kube-api-access-4h8tc\") pod \"whisker-b767f6fff-w2bfk\" (UID: \"3934179d-fb13-45a1-9643-cbd7ec08e773\") " pod="calico-system/whisker-b767f6fff-w2bfk" Jan 28 01:15:53.310716 containerd[1674]: time="2026-01-28T01:15:53.310594930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-b767f6fff-w2bfk,Uid:3934179d-fb13-45a1-9643-cbd7ec08e773,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:53.614000 audit: BPF prog-id=175 op=LOAD Jan 28 01:15:53.614000 audit[4185]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd8eecf970 a2=98 a3=1fffffffffffffff items=0 ppid=4076 pid=4185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.614000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:15:53.614000 audit: BPF prog-id=175 op=UNLOAD Jan 28 01:15:53.614000 audit[4185]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd8eecf940 a3=0 items=0 ppid=4076 pid=4185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.614000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:15:53.614000 audit: BPF prog-id=176 op=LOAD Jan 28 01:15:53.614000 audit[4185]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd8eecf850 a2=94 a3=3 items=0 ppid=4076 pid=4185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.614000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:15:53.619000 audit: BPF prog-id=176 op=UNLOAD Jan 28 01:15:53.619000 audit[4185]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd8eecf850 a2=94 a3=3 items=0 ppid=4076 pid=4185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.619000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:15:53.619000 audit: BPF prog-id=177 op=LOAD Jan 28 01:15:53.619000 audit[4185]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd8eecf890 a2=94 a3=7ffd8eecfa70 items=0 ppid=4076 pid=4185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.619000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:15:53.619000 audit: BPF prog-id=177 op=UNLOAD Jan 28 01:15:53.619000 audit[4185]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd8eecf890 a2=94 a3=7ffd8eecfa70 items=0 ppid=4076 pid=4185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.619000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:15:53.623732 systemd-networkd[1562]: cali160fead5a46: Link UP Jan 28 01:15:53.623000 audit: BPF prog-id=178 op=LOAD Jan 28 01:15:53.623000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcbf6e1850 a2=98 a3=3 items=0 
ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.623000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.623000 audit: BPF prog-id=178 op=UNLOAD Jan 28 01:15:53.623000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffcbf6e1820 a3=0 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.623000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.623000 audit: BPF prog-id=179 op=LOAD Jan 28 01:15:53.623000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcbf6e1640 a2=94 a3=54428f items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.623000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.623000 audit: BPF prog-id=179 op=UNLOAD Jan 28 01:15:53.623000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcbf6e1640 a2=94 a3=54428f items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.623000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.623000 audit: BPF prog-id=180 op=LOAD Jan 28 01:15:53.623000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcbf6e1670 a2=94 a3=2 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.623000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.623000 audit: BPF prog-id=180 op=UNLOAD Jan 28 01:15:53.623000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcbf6e1670 a2=0 a3=2 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.623000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.625304 systemd-networkd[1562]: cali160fead5a46: Gained carrier Jan 28 01:15:53.643021 containerd[1674]: 2026-01-28 01:15:53.368 [INFO][4124] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 28 01:15:53.643021 containerd[1674]: 2026-01-28 01:15:53.527 [INFO][4124] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0 whisker-b767f6fff- calico-system 3934179d-fb13-45a1-9643-cbd7ec08e773 894 0 2026-01-28 01:15:52 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:b767f6fff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4593-0-0-n-62761e1650 whisker-b767f6fff-w2bfk eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali160fead5a46 [] [] }} ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Namespace="calico-system" Pod="whisker-b767f6fff-w2bfk" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-" Jan 28 01:15:53.643021 containerd[1674]: 2026-01-28 01:15:53.527 [INFO][4124] cni-plugin/k8s.go 74: Extracted identifiers for 
CmdAddK8s ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Namespace="calico-system" Pod="whisker-b767f6fff-w2bfk" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" Jan 28 01:15:53.643021 containerd[1674]: 2026-01-28 01:15:53.570 [INFO][4157] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" HandleID="k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Workload="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.570 [INFO][4157] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" HandleID="k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Workload="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ccfe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-62761e1650", "pod":"whisker-b767f6fff-w2bfk", "timestamp":"2026-01-28 01:15:53.570232921 +0000 UTC"}, Hostname:"ci-4593-0-0-n-62761e1650", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.570 [INFO][4157] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.570 [INFO][4157] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.570 [INFO][4157] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-62761e1650' Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.577 [INFO][4157] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.582 [INFO][4157] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.586 [INFO][4157] ipam/ipam.go 511: Trying affinity for 192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.588 [INFO][4157] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643243 containerd[1674]: 2026-01-28 01:15:53.591 [INFO][4157] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643425 containerd[1674]: 2026-01-28 01:15:53.591 [INFO][4157] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.80.192/26 handle="k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643425 containerd[1674]: 2026-01-28 01:15:53.592 [INFO][4157] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda Jan 28 01:15:53.643425 containerd[1674]: 2026-01-28 01:15:53.597 [INFO][4157] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.80.192/26 handle="k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643425 containerd[1674]: 2026-01-28 01:15:53.604 [INFO][4157] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.80.193/26] block=192.168.80.192/26 handle="k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643425 containerd[1674]: 2026-01-28 01:15:53.604 [INFO][4157] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.193/26] handle="k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:53.643425 containerd[1674]: 2026-01-28 01:15:53.604 [INFO][4157] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 01:15:53.643425 containerd[1674]: 2026-01-28 01:15:53.604 [INFO][4157] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.80.193/26] IPv6=[] ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" HandleID="k8s-pod-network.1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Workload="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" Jan 28 01:15:53.643560 containerd[1674]: 2026-01-28 01:15:53.609 [INFO][4124] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Namespace="calico-system" Pod="whisker-b767f6fff-w2bfk" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0", GenerateName:"whisker-b767f6fff-", Namespace:"calico-system", SelfLink:"", UID:"3934179d-fb13-45a1-9643-cbd7ec08e773", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"b767f6fff", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"", Pod:"whisker-b767f6fff-w2bfk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.80.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali160fead5a46", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:53.643560 containerd[1674]: 2026-01-28 01:15:53.609 [INFO][4124] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.193/32] ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Namespace="calico-system" Pod="whisker-b767f6fff-w2bfk" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" Jan 28 01:15:53.643665 containerd[1674]: 2026-01-28 01:15:53.609 [INFO][4124] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali160fead5a46 ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Namespace="calico-system" Pod="whisker-b767f6fff-w2bfk" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" Jan 28 01:15:53.643665 containerd[1674]: 2026-01-28 01:15:53.625 [INFO][4124] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Namespace="calico-system" Pod="whisker-b767f6fff-w2bfk" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" Jan 28 01:15:53.643706 containerd[1674]: 2026-01-28 01:15:53.627 [INFO][4124] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" Namespace="calico-system" Pod="whisker-b767f6fff-w2bfk" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0", GenerateName:"whisker-b767f6fff-", Namespace:"calico-system", SelfLink:"", UID:"3934179d-fb13-45a1-9643-cbd7ec08e773", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"b767f6fff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda", Pod:"whisker-b767f6fff-w2bfk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.80.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali160fead5a46", MAC:"ea:26:41:ac:ef:c6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:53.643757 containerd[1674]: 2026-01-28 01:15:53.640 [INFO][4124] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" 
Namespace="calico-system" Pod="whisker-b767f6fff-w2bfk" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-whisker--b767f6fff--w2bfk-eth0" Jan 28 01:15:53.697972 containerd[1674]: time="2026-01-28T01:15:53.697922251Z" level=info msg="connecting to shim 1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda" address="unix:///run/containerd/s/e5de6766fd4eb7ee5934f6faf904e6dcec096ca17f0531c234cacc924d6e8b1f" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:53.721281 systemd[1]: Started cri-containerd-1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda.scope - libcontainer container 1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda. Jan 28 01:15:53.735000 audit: BPF prog-id=181 op=LOAD Jan 28 01:15:53.736000 audit: BPF prog-id=182 op=LOAD Jan 28 01:15:53.736000 audit[4218]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164346665326331336464643862303839306263343434653731316633 Jan 28 01:15:53.736000 audit: BPF prog-id=182 op=UNLOAD Jan 28 01:15:53.736000 audit[4218]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.736000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164346665326331336464643862303839306263343434653731316633 Jan 28 01:15:53.736000 audit: BPF prog-id=183 op=LOAD Jan 28 01:15:53.736000 audit[4218]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164346665326331336464643862303839306263343434653731316633 Jan 28 01:15:53.736000 audit: BPF prog-id=184 op=LOAD Jan 28 01:15:53.736000 audit[4218]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164346665326331336464643862303839306263343434653731316633 Jan 28 01:15:53.736000 audit: BPF prog-id=184 op=UNLOAD Jan 28 01:15:53.736000 audit[4218]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 28 01:15:53.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164346665326331336464643862303839306263343434653731316633 Jan 28 01:15:53.736000 audit: BPF prog-id=183 op=UNLOAD Jan 28 01:15:53.736000 audit[4218]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164346665326331336464643862303839306263343434653731316633 Jan 28 01:15:53.736000 audit: BPF prog-id=185 op=LOAD Jan 28 01:15:53.736000 audit[4218]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164346665326331336464643862303839306263343434653731316633 Jan 28 01:15:53.778595 containerd[1674]: time="2026-01-28T01:15:53.778566197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-b767f6fff-w2bfk,Uid:3934179d-fb13-45a1-9643-cbd7ec08e773,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"1d4fe2c13ddd8b0890bc444e711f37da2af10481b83522378bf7722ab818eeda\"" Jan 28 01:15:53.780689 containerd[1674]: time="2026-01-28T01:15:53.780580491Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:15:53.823000 audit: BPF prog-id=186 op=LOAD Jan 28 01:15:53.823000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcbf6e1530 a2=94 a3=1 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.823000 audit: BPF prog-id=186 op=UNLOAD Jan 28 01:15:53.823000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcbf6e1530 a2=94 a3=1 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.834000 audit: BPF prog-id=187 op=LOAD Jan 28 01:15:53.834000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcbf6e1520 a2=94 a3=4 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.834000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.834000 audit: BPF prog-id=187 op=UNLOAD Jan 28 01:15:53.834000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcbf6e1520 a2=0 a3=4 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.834000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.834000 audit: BPF prog-id=188 op=LOAD Jan 28 01:15:53.834000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcbf6e1380 a2=94 a3=5 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.834000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.834000 audit: BPF prog-id=188 op=UNLOAD Jan 28 01:15:53.834000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffcbf6e1380 a2=0 a3=5 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.834000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.834000 audit: BPF prog-id=189 op=LOAD Jan 28 01:15:53.834000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcbf6e15a0 a2=94 a3=6 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.834000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.834000 audit: BPF prog-id=189 op=UNLOAD Jan 28 01:15:53.834000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcbf6e15a0 a2=0 a3=6 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.834000 audit: 
PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.834000 audit: BPF prog-id=190 op=LOAD Jan 28 01:15:53.834000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcbf6e0d50 a2=94 a3=88 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.834000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.835000 audit: BPF prog-id=191 op=LOAD Jan 28 01:15:53.835000 audit[4193]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffcbf6e0bd0 a2=94 a3=2 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.835000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.836000 audit: BPF prog-id=191 op=UNLOAD Jan 28 01:15:53.836000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffcbf6e0c00 a2=0 a3=7ffcbf6e0d00 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.836000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.836000 audit: BPF prog-id=190 op=UNLOAD Jan 28 01:15:53.836000 audit[4193]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=3f855d10 a2=0 a3=44a7c637cd88bc41 items=0 ppid=4076 pid=4193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.836000 audit: PROCTITLE 
proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:15:53.844000 audit: BPF prog-id=192 op=LOAD Jan 28 01:15:53.844000 audit[4245]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffda9e33cf0 a2=98 a3=1999999999999999 items=0 ppid=4076 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.844000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:15:53.844000 audit: BPF prog-id=192 op=UNLOAD Jan 28 01:15:53.844000 audit[4245]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffda9e33cc0 a3=0 items=0 ppid=4076 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.844000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:15:53.844000 audit: BPF prog-id=193 op=LOAD Jan 28 01:15:53.844000 audit[4245]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffda9e33bd0 a2=94 a3=ffff items=0 ppid=4076 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.844000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:15:53.844000 audit: BPF prog-id=193 op=UNLOAD Jan 28 01:15:53.844000 audit[4245]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffda9e33bd0 a2=94 a3=ffff items=0 ppid=4076 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.844000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:15:53.844000 audit: BPF prog-id=194 op=LOAD Jan 28 01:15:53.844000 audit[4245]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffda9e33c10 a2=94 a3=7ffda9e33df0 items=0 ppid=4076 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.844000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:15:53.844000 audit: BPF prog-id=194 op=UNLOAD Jan 28 01:15:53.844000 audit[4245]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffda9e33c10 a2=94 a3=7ffda9e33df0 items=0 ppid=4076 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.844000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:15:53.914673 systemd-networkd[1562]: vxlan.calico: Link UP Jan 28 01:15:53.914680 systemd-networkd[1562]: vxlan.calico: Gained carrier Jan 28 01:15:53.918000 audit: BPF prog-id=195 op=LOAD Jan 28 01:15:53.918000 audit[4270]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd30034b50 a2=98 a3=0 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.918000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.919000 audit: BPF prog-id=195 op=UNLOAD Jan 28 01:15:53.919000 audit[4270]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd30034b20 a3=0 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.919000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.919000 audit: BPF prog-id=196 op=LOAD Jan 28 01:15:53.919000 audit[4270]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd30034960 a2=94 a3=54428f items=0 
ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.919000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.920000 audit: BPF prog-id=196 op=UNLOAD Jan 28 01:15:53.920000 audit[4270]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd30034960 a2=94 a3=54428f items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.920000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.920000 audit: BPF prog-id=197 op=LOAD Jan 28 01:15:53.920000 audit[4270]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd30034990 a2=94 a3=2 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.920000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.920000 audit: BPF prog-id=197 op=UNLOAD Jan 28 01:15:53.920000 audit[4270]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd30034990 a2=0 a3=2 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.920000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.920000 audit: BPF prog-id=198 op=LOAD Jan 28 01:15:53.920000 audit[4270]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd30034740 a2=94 a3=4 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.920000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.920000 audit: BPF prog-id=198 op=UNLOAD Jan 28 01:15:53.920000 audit[4270]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd30034740 a2=94 a3=4 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.920000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.920000 audit: BPF prog-id=199 op=LOAD Jan 28 01:15:53.920000 audit[4270]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd30034840 a2=94 a3=7ffd300349c0 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.920000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.921000 audit: BPF prog-id=199 op=UNLOAD Jan 28 01:15:53.921000 audit[4270]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd30034840 a2=0 a3=7ffd300349c0 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.921000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.923000 audit: BPF prog-id=200 op=LOAD Jan 28 01:15:53.923000 audit[4270]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd30033f70 a2=94 a3=2 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.923000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.923000 audit: BPF prog-id=200 op=UNLOAD Jan 28 01:15:53.923000 audit[4270]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd30033f70 a2=0 a3=2 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.923000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.923000 audit: BPF prog-id=201 op=LOAD Jan 28 01:15:53.923000 audit[4270]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd30034070 a2=94 a3=30 items=0 ppid=4076 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.923000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:15:53.932000 audit: BPF prog-id=202 op=LOAD Jan 28 01:15:53.932000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe9ff36600 a2=98 a3=0 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.932000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:53.932000 audit: BPF prog-id=202 op=UNLOAD Jan 28 01:15:53.932000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe9ff365d0 a3=0 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.932000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:53.933000 audit: BPF prog-id=203 op=LOAD Jan 28 01:15:53.933000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe9ff363f0 a2=94 a3=54428f items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.933000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:53.933000 audit: BPF prog-id=203 op=UNLOAD Jan 28 01:15:53.933000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe9ff363f0 a2=94 a3=54428f items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.933000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:53.933000 audit: BPF prog-id=204 op=LOAD Jan 28 01:15:53.933000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe9ff36420 a2=94 a3=2 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.933000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:53.933000 audit: BPF prog-id=204 op=UNLOAD Jan 28 01:15:53.933000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe9ff36420 a2=0 a3=2 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:53.933000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.101000 audit: BPF prog-id=205 op=LOAD Jan 28 01:15:54.101000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe9ff362e0 a2=94 a3=1 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.101000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.101000 audit: BPF prog-id=205 op=UNLOAD Jan 28 01:15:54.101000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe9ff362e0 a2=94 a3=1 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.101000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.111000 audit: BPF prog-id=206 op=LOAD Jan 28 01:15:54.111000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe9ff362d0 a2=94 a3=4 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.111000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.112000 audit: BPF prog-id=206 op=UNLOAD Jan 28 01:15:54.112000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe9ff362d0 a2=0 a3=4 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.112000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.112000 audit: BPF prog-id=207 op=LOAD Jan 28 01:15:54.112000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe9ff36130 a2=94 a3=5 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.112000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.112000 audit: BPF prog-id=207 op=UNLOAD Jan 28 01:15:54.112000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffe9ff36130 a2=0 a3=5 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.112000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.112000 audit: BPF prog-id=208 op=LOAD Jan 28 01:15:54.112000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe9ff36350 a2=94 a3=6 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.112000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.112000 audit: BPF prog-id=208 op=UNLOAD Jan 28 01:15:54.112000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe9ff36350 a2=0 a3=6 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.112000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.112000 audit: BPF prog-id=209 op=LOAD Jan 28 01:15:54.112000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe9ff35b00 a2=94 a3=88 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.112000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.112000 audit: BPF prog-id=210 op=LOAD Jan 28 01:15:54.112000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffe9ff35980 a2=94 a3=2 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.112000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.112000 audit: BPF prog-id=210 op=UNLOAD Jan 28 01:15:54.112000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffe9ff359b0 a2=0 a3=7ffe9ff35ab0 items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.112000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.113000 audit: BPF prog-id=209 op=UNLOAD Jan 28 01:15:54.113000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=9ba8d10 a2=0 a3=2733283f66594d8b items=0 ppid=4076 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.113000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:15:54.118165 containerd[1674]: time="2026-01-28T01:15:54.117984512Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:15:54.119746 containerd[1674]: time="2026-01-28T01:15:54.119713788Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:15:54.119865 containerd[1674]: time="2026-01-28T01:15:54.119771165Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:15:54.120177 kubelet[2910]: E0128 01:15:54.120140 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:15:54.120498 kubelet[2910]: E0128 01:15:54.120192 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc 
= failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:15:54.120000 audit: BPF prog-id=201 op=UNLOAD Jan 28 01:15:54.120000 audit[4076]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c000ebc800 a2=0 a3=0 items=0 ppid=4060 pid=4076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.120000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 28 01:15:54.127617 kubelet[2910]: E0128 01:15:54.127423 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8d1c1aed77714da78e5314e3a5e614ef,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&Seccom
pProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:15:54.129710 containerd[1674]: time="2026-01-28T01:15:54.129677000Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:15:54.191000 audit[4296]: NETFILTER_CFG table=nat:119 family=2 entries=15 op=nft_register_chain pid=4296 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:54.191000 audit[4296]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffd16e6ec00 a2=0 a3=7ffd16e6ebec items=0 ppid=4076 pid=4296 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.191000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:54.198000 audit[4300]: NETFILTER_CFG table=mangle:120 family=2 entries=16 op=nft_register_chain pid=4300 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:54.198000 audit[4300]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffde6850a60 a2=0 a3=7ffde6850a4c items=0 ppid=4076 pid=4300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.198000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:54.209000 audit[4297]: NETFILTER_CFG table=raw:121 family=2 entries=21 op=nft_register_chain pid=4297 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:54.209000 audit[4297]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffebcda5a80 a2=0 a3=7ffebcda5a6c items=0 ppid=4076 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.209000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:54.210000 audit[4298]: NETFILTER_CFG table=filter:122 family=2 entries=94 op=nft_register_chain pid=4298 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:54.210000 audit[4298]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7ffc9374da80 a2=0 a3=55e7c373e000 items=0 ppid=4076 pid=4298 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:54.210000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:54.467985 containerd[1674]: time="2026-01-28T01:15:54.467868930Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:15:54.469716 containerd[1674]: time="2026-01-28T01:15:54.469667571Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:15:54.469790 containerd[1674]: time="2026-01-28T01:15:54.469750558Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:15:54.470220 kubelet[2910]: E0128 01:15:54.469961 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:15:54.470220 kubelet[2910]: E0128 01:15:54.470024 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:15:54.470321 kubelet[2910]: E0128 01:15:54.470163 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:15:54.471713 kubelet[2910]: E0128 01:15:54.471674 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:15:54.747173 containerd[1674]: time="2026-01-28T01:15:54.747031104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76f584f9b9-9mjk9,Uid:7ea90b44-fc7d-4702-a1a5-1c558b3ecd80,Namespace:calico-apiserver,Attempt:0,}" Jan 28 01:15:54.748171 containerd[1674]: time="2026-01-28T01:15:54.748071448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76f584f9b9-6dh2c,Uid:bead6395-8434-48df-aa67-e987782da70c,Namespace:calico-apiserver,Attempt:0,}" Jan 28 01:15:54.748171 containerd[1674]: time="2026-01-28T01:15:54.748144279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-cblpt,Uid:c22de3ae-0a27-443f-9dd3-c4ab0a4176bd,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:54.749384 containerd[1674]: time="2026-01-28T01:15:54.749126818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cxbk5,Uid:aab2583c-2dbb-4842-965d-f4f1d01197b0,Namespace:kube-system,Attempt:0,}" Jan 28 01:15:54.750270 kubelet[2910]: I0128 01:15:54.750062 2910 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="444bc361-893d-40c2-bec0-a4908317d6e3" path="/var/lib/kubelet/pods/444bc361-893d-40c2-bec0-a4908317d6e3/volumes" Jan 28 01:15:54.928586 kubelet[2910]: E0128 01:15:54.928526 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:15:54.968568 systemd-networkd[1562]: calid0363403551: Link UP Jan 28 01:15:54.970961 systemd-networkd[1562]: calid0363403551: Gained carrier Jan 28 01:15:55.000760 containerd[1674]: 2026-01-28 01:15:54.860 [INFO][4315] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0 coredns-674b8bbfcf- kube-system aab2583c-2dbb-4842-965d-f4f1d01197b0 818 0 2026-01-28 01:15:18 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-n-62761e1650 coredns-674b8bbfcf-cxbk5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid0363403551 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxbk5" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-" Jan 28 01:15:55.000760 containerd[1674]: 2026-01-28 01:15:54.860 [INFO][4315] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxbk5" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" Jan 28 01:15:55.000760 containerd[1674]: 2026-01-28 01:15:54.900 [INFO][4357] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" HandleID="k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Workload="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.900 [INFO][4357] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" HandleID="k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Workload="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5150), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-n-62761e1650", "pod":"coredns-674b8bbfcf-cxbk5", "timestamp":"2026-01-28 01:15:54.900695861 +0000 UTC"}, Hostname:"ci-4593-0-0-n-62761e1650", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.901 [INFO][4357] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.901 [INFO][4357] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.901 [INFO][4357] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-62761e1650' Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.912 [INFO][4357] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.921 [INFO][4357] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.927 [INFO][4357] ipam/ipam.go 511: Trying affinity for 192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.932 [INFO][4357] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.000973 containerd[1674]: 2026-01-28 01:15:54.937 [INFO][4357] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.001958 containerd[1674]: 2026-01-28 01:15:54.937 [INFO][4357] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.80.192/26 handle="k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.001958 containerd[1674]: 2026-01-28 01:15:54.946 [INFO][4357] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257 Jan 28 01:15:55.001958 containerd[1674]: 2026-01-28 01:15:54.954 [INFO][4357] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.80.192/26 handle="k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" 
host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.001958 containerd[1674]: 2026-01-28 01:15:54.960 [INFO][4357] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.80.194/26] block=192.168.80.192/26 handle="k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.001958 containerd[1674]: 2026-01-28 01:15:54.960 [INFO][4357] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.194/26] handle="k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.001958 containerd[1674]: 2026-01-28 01:15:54.960 [INFO][4357] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 01:15:55.001958 containerd[1674]: 2026-01-28 01:15:54.960 [INFO][4357] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.80.194/26] IPv6=[] ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" HandleID="k8s-pod-network.6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Workload="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" Jan 28 01:15:55.002198 containerd[1674]: 2026-01-28 01:15:54.965 [INFO][4315] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxbk5" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"aab2583c-2dbb-4842-965d-f4f1d01197b0", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"", Pod:"coredns-674b8bbfcf-cxbk5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.80.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid0363403551", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.002198 containerd[1674]: 2026-01-28 01:15:54.965 [INFO][4315] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.194/32] ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxbk5" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" Jan 28 01:15:55.002198 containerd[1674]: 2026-01-28 01:15:54.965 [INFO][4315] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid0363403551 ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxbk5" 
WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" Jan 28 01:15:55.002198 containerd[1674]: 2026-01-28 01:15:54.970 [INFO][4315] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxbk5" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" Jan 28 01:15:55.002198 containerd[1674]: 2026-01-28 01:15:54.971 [INFO][4315] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxbk5" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"aab2583c-2dbb-4842-965d-f4f1d01197b0", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257", Pod:"coredns-674b8bbfcf-cxbk5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.80.194/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid0363403551", MAC:"9e:8d:ea:9d:6d:32", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.002198 containerd[1674]: 2026-01-28 01:15:54.995 [INFO][4315] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxbk5" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--cxbk5-eth0" Jan 28 01:15:55.008000 audit[4397]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=4397 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:55.008000 audit[4397]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcd762cf80 a2=0 a3=7ffcd762cf6c items=0 ppid=3019 pid=4397 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.008000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:55.011000 audit[4397]: NETFILTER_CFG table=nat:124 family=2 entries=14 op=nft_register_rule pid=4397 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:55.011000 audit[4397]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=3468 a0=3 a1=7ffcd762cf80 a2=0 a3=0 items=0 ppid=3019 pid=4397 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.011000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:55.022000 audit[4400]: NETFILTER_CFG table=filter:125 family=2 entries=42 op=nft_register_chain pid=4400 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:55.022000 audit[4400]: SYSCALL arch=c000003e syscall=46 success=yes exit=22552 a0=3 a1=7ffde3dbb230 a2=0 a3=7ffde3dbb21c items=0 ppid=4076 pid=4400 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.022000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:55.051516 containerd[1674]: time="2026-01-28T01:15:55.051216796Z" level=info msg="connecting to shim 6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257" address="unix:///run/containerd/s/12c96e6742a1832a715af7c203ed64e03fd240d1cd677385fae8d669bf27553e" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:55.061147 systemd-networkd[1562]: cali0667187feaa: Link UP Jan 28 01:15:55.062318 systemd-networkd[1562]: cali0667187feaa: Gained carrier Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:54.890 [INFO][4311] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0 calico-apiserver-76f584f9b9- calico-apiserver 7ea90b44-fc7d-4702-a1a5-1c558b3ecd80 819 
0 2026-01-28 01:15:27 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76f584f9b9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-n-62761e1650 calico-apiserver-76f584f9b9-9mjk9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0667187feaa [] [] }} ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-9mjk9" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:54.890 [INFO][4311] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-9mjk9" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:54.950 [INFO][4367] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" HandleID="k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:54.953 [INFO][4367] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" HandleID="k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5cc0), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-n-62761e1650", "pod":"calico-apiserver-76f584f9b9-9mjk9", "timestamp":"2026-01-28 01:15:54.950875393 +0000 UTC"}, Hostname:"ci-4593-0-0-n-62761e1650", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:54.953 [INFO][4367] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:54.962 [INFO][4367] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:54.962 [INFO][4367] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-62761e1650' Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.013 [INFO][4367] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.018 [INFO][4367] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.030 [INFO][4367] ipam/ipam.go 511: Trying affinity for 192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.034 [INFO][4367] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.039 [INFO][4367] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.039 [INFO][4367] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.80.192/26 
handle="k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.042 [INFO][4367] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.047 [INFO][4367] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.80.192/26 handle="k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.054 [INFO][4367] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.80.195/26] block=192.168.80.192/26 handle="k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.055 [INFO][4367] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.195/26] handle="k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.055 [INFO][4367] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 01:15:55.078923 containerd[1674]: 2026-01-28 01:15:55.055 [INFO][4367] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.80.195/26] IPv6=[] ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" HandleID="k8s-pod-network.3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" Jan 28 01:15:55.079749 containerd[1674]: 2026-01-28 01:15:55.058 [INFO][4311] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-9mjk9" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0", GenerateName:"calico-apiserver-76f584f9b9-", Namespace:"calico-apiserver", SelfLink:"", UID:"7ea90b44-fc7d-4702-a1a5-1c558b3ecd80", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76f584f9b9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"", Pod:"calico-apiserver-76f584f9b9-9mjk9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.80.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0667187feaa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.079749 containerd[1674]: 2026-01-28 01:15:55.058 [INFO][4311] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.195/32] ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-9mjk9" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" Jan 28 01:15:55.079749 containerd[1674]: 2026-01-28 01:15:55.058 [INFO][4311] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0667187feaa ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-9mjk9" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" Jan 28 01:15:55.079749 containerd[1674]: 2026-01-28 01:15:55.060 [INFO][4311] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-9mjk9" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" Jan 28 01:15:55.079749 containerd[1674]: 2026-01-28 01:15:55.062 [INFO][4311] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-9mjk9" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0", GenerateName:"calico-apiserver-76f584f9b9-", Namespace:"calico-apiserver", SelfLink:"", UID:"7ea90b44-fc7d-4702-a1a5-1c558b3ecd80", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76f584f9b9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb", Pod:"calico-apiserver-76f584f9b9-9mjk9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.80.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0667187feaa", MAC:"06:f4:83:26:fe:85", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.079749 containerd[1674]: 2026-01-28 01:15:55.076 [INFO][4311] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-9mjk9" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--9mjk9-eth0" Jan 28 01:15:55.096000 audit[4441]: NETFILTER_CFG table=filter:126 family=2 entries=54 
op=nft_register_chain pid=4441 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:55.096000 audit[4441]: SYSCALL arch=c000003e syscall=46 success=yes exit=29396 a0=3 a1=7ffe89d1ea40 a2=0 a3=7ffe89d1ea2c items=0 ppid=4076 pid=4441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.096000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:55.099248 systemd[1]: Started cri-containerd-6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257.scope - libcontainer container 6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257. Jan 28 01:15:55.109000 audit: BPF prog-id=211 op=LOAD Jan 28 01:15:55.109000 audit: BPF prog-id=212 op=LOAD Jan 28 01:15:55.109000 audit[4422]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4410 pid=4422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.109000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661316461393762386531653466353437346134363839353831626433 Jan 28 01:15:55.109000 audit: BPF prog-id=212 op=UNLOAD Jan 28 01:15:55.109000 audit[4422]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4410 pid=4422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.109000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661316461393762386531653466353437346134363839353831626433 Jan 28 01:15:55.109000 audit: BPF prog-id=213 op=LOAD Jan 28 01:15:55.109000 audit[4422]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4410 pid=4422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.109000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661316461393762386531653466353437346134363839353831626433 Jan 28 01:15:55.109000 audit: BPF prog-id=214 op=LOAD Jan 28 01:15:55.109000 audit[4422]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4410 pid=4422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.109000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661316461393762386531653466353437346134363839353831626433 Jan 28 01:15:55.110000 audit: BPF prog-id=214 op=UNLOAD Jan 28 01:15:55.110000 audit[4422]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4410 pid=4422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661316461393762386531653466353437346134363839353831626433 Jan 28 01:15:55.110000 audit: BPF prog-id=213 op=UNLOAD Jan 28 01:15:55.110000 audit[4422]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4410 pid=4422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661316461393762386531653466353437346134363839353831626433 Jan 28 01:15:55.110000 audit: BPF prog-id=215 op=LOAD Jan 28 01:15:55.110000 audit[4422]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4410 pid=4422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.110000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3661316461393762386531653466353437346134363839353831626433 Jan 28 01:15:55.123577 containerd[1674]: time="2026-01-28T01:15:55.123419481Z" level=info msg="connecting to shim 3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb" address="unix:///run/containerd/s/fc52aab62e5d4dc92113f0280cf7947169dc7065766673859518a87a93864767" 
namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:55.163621 systemd[1]: Started cri-containerd-3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb.scope - libcontainer container 3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb. Jan 28 01:15:55.170859 systemd-networkd[1562]: cali9b0cb966255: Link UP Jan 28 01:15:55.172434 systemd-networkd[1562]: cali9b0cb966255: Gained carrier Jan 28 01:15:55.203546 containerd[1674]: time="2026-01-28T01:15:55.202841425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cxbk5,Uid:aab2583c-2dbb-4842-965d-f4f1d01197b0,Namespace:kube-system,Attempt:0,} returns sandbox id \"6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257\"" Jan 28 01:15:55.217000 audit: BPF prog-id=216 op=LOAD Jan 28 01:15:55.219000 audit: BPF prog-id=217 op=LOAD Jan 28 01:15:55.219000 audit[4469]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4458 pid=4469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.219000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323361636161643135653730356461313137343333316632613133 Jan 28 01:15:55.219000 audit: BPF prog-id=217 op=UNLOAD Jan 28 01:15:55.219000 audit[4469]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4458 pid=4469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.219000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323361636161643135653730356461313137343333316632613133 Jan 28 01:15:55.221000 audit: BPF prog-id=218 op=LOAD Jan 28 01:15:55.221000 audit[4469]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4458 pid=4469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323361636161643135653730356461313137343333316632613133 Jan 28 01:15:55.222840 containerd[1674]: time="2026-01-28T01:15:55.222795388Z" level=info msg="CreateContainer within sandbox \"6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 28 01:15:55.222000 audit: BPF prog-id=219 op=LOAD Jan 28 01:15:55.222000 audit[4469]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4458 pid=4469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323361636161643135653730356461313137343333316632613133 Jan 28 01:15:55.222000 audit: BPF prog-id=219 op=UNLOAD Jan 28 01:15:55.222000 audit[4469]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4458 pid=4469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323361636161643135653730356461313137343333316632613133 Jan 28 01:15:55.222000 audit: BPF prog-id=218 op=UNLOAD Jan 28 01:15:55.222000 audit[4469]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4458 pid=4469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323361636161643135653730356461313137343333316632613133 Jan 28 01:15:55.222000 audit: BPF prog-id=220 op=LOAD Jan 28 01:15:55.222000 audit[4469]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4458 pid=4469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323361636161643135653730356461313137343333316632613133 Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:54.901 
[INFO][4336] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0 calico-apiserver-76f584f9b9- calico-apiserver bead6395-8434-48df-aa67-e987782da70c 821 0 2026-01-28 01:15:28 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76f584f9b9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-n-62761e1650 calico-apiserver-76f584f9b9-6dh2c eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9b0cb966255 [] [] }} ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-6dh2c" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:54.902 [INFO][4336] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-6dh2c" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:54.996 [INFO][4373] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" HandleID="k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:54.998 [INFO][4373] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" 
HandleID="k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000333410), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-n-62761e1650", "pod":"calico-apiserver-76f584f9b9-6dh2c", "timestamp":"2026-01-28 01:15:54.99680684 +0000 UTC"}, Hostname:"ci-4593-0-0-n-62761e1650", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:54.998 [INFO][4373] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.055 [INFO][4373] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.055 [INFO][4373] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-62761e1650' Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.113 [INFO][4373] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.119 [INFO][4373] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.127 [INFO][4373] ipam/ipam.go 511: Trying affinity for 192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.130 [INFO][4373] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.133 [INFO][4373] ipam/ipam.go 235: Affinity 
is confirmed and block has been loaded cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.133 [INFO][4373] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.80.192/26 handle="k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.134 [INFO][4373] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136 Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.140 [INFO][4373] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.80.192/26 handle="k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.153 [INFO][4373] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.80.196/26] block=192.168.80.192/26 handle="k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.153 [INFO][4373] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.196/26] handle="k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.153 [INFO][4373] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 01:15:55.228442 containerd[1674]: 2026-01-28 01:15:55.153 [INFO][4373] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.80.196/26] IPv6=[] ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" HandleID="k8s-pod-network.3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" Jan 28 01:15:55.228962 containerd[1674]: 2026-01-28 01:15:55.159 [INFO][4336] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-6dh2c" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0", GenerateName:"calico-apiserver-76f584f9b9-", Namespace:"calico-apiserver", SelfLink:"", UID:"bead6395-8434-48df-aa67-e987782da70c", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76f584f9b9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"", Pod:"calico-apiserver-76f584f9b9-6dh2c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.80.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9b0cb966255", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.228962 containerd[1674]: 2026-01-28 01:15:55.161 [INFO][4336] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.196/32] ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-6dh2c" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" Jan 28 01:15:55.228962 containerd[1674]: 2026-01-28 01:15:55.161 [INFO][4336] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9b0cb966255 ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-6dh2c" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" Jan 28 01:15:55.228962 containerd[1674]: 2026-01-28 01:15:55.173 [INFO][4336] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-6dh2c" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" Jan 28 01:15:55.228962 containerd[1674]: 2026-01-28 01:15:55.187 [INFO][4336] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-6dh2c" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0", GenerateName:"calico-apiserver-76f584f9b9-", Namespace:"calico-apiserver", SelfLink:"", UID:"bead6395-8434-48df-aa67-e987782da70c", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76f584f9b9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136", Pod:"calico-apiserver-76f584f9b9-6dh2c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.80.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9b0cb966255", MAC:"ee:44:ae:1b:cf:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.228962 containerd[1674]: 2026-01-28 01:15:55.219 [INFO][4336] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" Namespace="calico-apiserver" Pod="calico-apiserver-76f584f9b9-6dh2c" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--apiserver--76f584f9b9--6dh2c-eth0" Jan 28 01:15:55.250721 systemd-networkd[1562]: vxlan.calico: Gained IPv6LL Jan 28 
01:15:55.267000 audit[4503]: NETFILTER_CFG table=filter:127 family=2 entries=45 op=nft_register_chain pid=4503 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:55.267000 audit[4503]: SYSCALL arch=c000003e syscall=46 success=yes exit=24264 a0=3 a1=7ffec6dcda40 a2=0 a3=7ffec6dcda2c items=0 ppid=4076 pid=4503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.267000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:55.281687 containerd[1674]: time="2026-01-28T01:15:55.281653414Z" level=info msg="Container 7f7d8a2a8b541c18ee6f802a1c4159d5d609efeaa74f7ee992fe17406874aca7: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:55.300265 systemd-networkd[1562]: calic611e4593bd: Link UP Jan 28 01:15:55.300610 systemd-networkd[1562]: calic611e4593bd: Gained carrier Jan 28 01:15:55.307795 containerd[1674]: time="2026-01-28T01:15:55.307760564Z" level=info msg="CreateContainer within sandbox \"6a1da97b8e1e4f5474a4689581bd32bb42d42c01c7c04329ca84152d713ea257\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7f7d8a2a8b541c18ee6f802a1c4159d5d609efeaa74f7ee992fe17406874aca7\"" Jan 28 01:15:55.310539 containerd[1674]: time="2026-01-28T01:15:55.310266927Z" level=info msg="StartContainer for \"7f7d8a2a8b541c18ee6f802a1c4159d5d609efeaa74f7ee992fe17406874aca7\"" Jan 28 01:15:55.313166 containerd[1674]: time="2026-01-28T01:15:55.313106776Z" level=info msg="connecting to shim 7f7d8a2a8b541c18ee6f802a1c4159d5d609efeaa74f7ee992fe17406874aca7" address="unix:///run/containerd/s/12c96e6742a1832a715af7c203ed64e03fd240d1cd677385fae8d669bf27553e" protocol=ttrpc version=3 Jan 28 01:15:55.334709 containerd[1674]: time="2026-01-28T01:15:55.334630528Z" 
level=info msg="connecting to shim 3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136" address="unix:///run/containerd/s/34267cd9efecf129acf33b9c6853c898f54eb44f31d4bb9ab24055d6065681e2" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:54.903 [INFO][4331] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0 goldmane-666569f655- calico-system c22de3ae-0a27-443f-9dd3-c4ab0a4176bd 820 0 2026-01-28 01:15:31 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4593-0-0-n-62761e1650 goldmane-666569f655-cblpt eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic611e4593bd [] [] }} ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Namespace="calico-system" Pod="goldmane-666569f655-cblpt" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:54.903 [INFO][4331] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Namespace="calico-system" Pod="goldmane-666569f655-cblpt" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.003 [INFO][4379] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" HandleID="k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Workload="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 
01:15:55.003 [INFO][4379] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" HandleID="k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Workload="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d57e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-62761e1650", "pod":"goldmane-666569f655-cblpt", "timestamp":"2026-01-28 01:15:55.003285563 +0000 UTC"}, Hostname:"ci-4593-0-0-n-62761e1650", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.003 [INFO][4379] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.153 [INFO][4379] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.154 [INFO][4379] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-62761e1650' Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.217 [INFO][4379] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.227 [INFO][4379] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.241 [INFO][4379] ipam/ipam.go 511: Trying affinity for 192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.244 [INFO][4379] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.252 [INFO][4379] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.252 [INFO][4379] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.80.192/26 handle="k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.256 [INFO][4379] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214 Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.275 [INFO][4379] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.80.192/26 handle="k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.286 [INFO][4379] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.80.197/26] block=192.168.80.192/26 handle="k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.287 [INFO][4379] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.197/26] handle="k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.287 [INFO][4379] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 01:15:55.338850 containerd[1674]: 2026-01-28 01:15:55.287 [INFO][4379] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.80.197/26] IPv6=[] ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" HandleID="k8s-pod-network.5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Workload="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" Jan 28 01:15:55.339585 containerd[1674]: 2026-01-28 01:15:55.291 [INFO][4331] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Namespace="calico-system" Pod="goldmane-666569f655-cblpt" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"c22de3ae-0a27-443f-9dd3-c4ab0a4176bd", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"", Pod:"goldmane-666569f655-cblpt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.80.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic611e4593bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.339585 containerd[1674]: 2026-01-28 01:15:55.291 [INFO][4331] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.197/32] ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Namespace="calico-system" Pod="goldmane-666569f655-cblpt" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" Jan 28 01:15:55.339585 containerd[1674]: 2026-01-28 01:15:55.291 [INFO][4331] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic611e4593bd ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Namespace="calico-system" Pod="goldmane-666569f655-cblpt" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" Jan 28 01:15:55.339585 containerd[1674]: 2026-01-28 01:15:55.302 [INFO][4331] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Namespace="calico-system" Pod="goldmane-666569f655-cblpt" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" Jan 28 01:15:55.339585 containerd[1674]: 2026-01-28 01:15:55.303 [INFO][4331] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Namespace="calico-system" Pod="goldmane-666569f655-cblpt" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"c22de3ae-0a27-443f-9dd3-c4ab0a4176bd", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214", Pod:"goldmane-666569f655-cblpt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.80.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic611e4593bd", MAC:"ea:86:e1:73:5e:1a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.339585 containerd[1674]: 2026-01-28 01:15:55.331 [INFO][4331] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" Namespace="calico-system" Pod="goldmane-666569f655-cblpt" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-goldmane--666569f655--cblpt-eth0" Jan 28 01:15:55.360542 systemd[1]: Started cri-containerd-7f7d8a2a8b541c18ee6f802a1c4159d5d609efeaa74f7ee992fe17406874aca7.scope - libcontainer container 7f7d8a2a8b541c18ee6f802a1c4159d5d609efeaa74f7ee992fe17406874aca7. Jan 28 01:15:55.381242 systemd[1]: Started cri-containerd-3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136.scope - libcontainer container 3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136. Jan 28 01:15:55.381000 audit[4573]: NETFILTER_CFG table=filter:128 family=2 entries=56 op=nft_register_chain pid=4573 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:55.381000 audit[4573]: SYSCALL arch=c000003e syscall=46 success=yes exit=28744 a0=3 a1=7ffcbafca970 a2=0 a3=7ffcbafca95c items=0 ppid=4076 pid=4573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.381000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:55.385000 audit: BPF prog-id=221 op=LOAD Jan 28 01:15:55.389000 audit: BPF prog-id=222 op=LOAD Jan 28 01:15:55.389000 audit[4512]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4410 pid=4512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.389000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766376438613261386235343163313865653666383032613163343135 Jan 28 01:15:55.389000 audit: BPF prog-id=222 op=UNLOAD Jan 28 01:15:55.389000 audit[4512]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4410 pid=4512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.389000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766376438613261386235343163313865653666383032613163343135 Jan 28 01:15:55.391754 containerd[1674]: time="2026-01-28T01:15:55.390062666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76f584f9b9-9mjk9,Uid:7ea90b44-fc7d-4702-a1a5-1c558b3ecd80,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3823acaad15e705da1174331f2a1388f83c5a0a1ddb5ebd5cd4b1e960ffafafb\"" Jan 28 01:15:55.391754 containerd[1674]: time="2026-01-28T01:15:55.390998297Z" level=info msg="connecting to shim 5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214" address="unix:///run/containerd/s/b41ebbeec714e45cc30e340c7f2c3672fe5d2afe17f9d27a8e953d699b654c54" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:55.391000 audit: BPF prog-id=223 op=LOAD Jan 28 01:15:55.391000 audit[4512]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4410 pid=4512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:15:55.391000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766376438613261386235343163313865653666383032613163343135 Jan 28 01:15:55.391000 audit: BPF prog-id=224 op=LOAD Jan 28 01:15:55.391000 audit[4512]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4410 pid=4512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.391000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766376438613261386235343163313865653666383032613163343135 Jan 28 01:15:55.391000 audit: BPF prog-id=224 op=UNLOAD Jan 28 01:15:55.391000 audit[4512]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4410 pid=4512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.391000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766376438613261386235343163313865653666383032613163343135 Jan 28 01:15:55.391000 audit: BPF prog-id=223 op=UNLOAD Jan 28 01:15:55.391000 audit[4512]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4410 pid=4512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.391000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766376438613261386235343163313865653666383032613163343135 Jan 28 01:15:55.394606 containerd[1674]: time="2026-01-28T01:15:55.394195901Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:15:55.392000 audit: BPF prog-id=225 op=LOAD Jan 28 01:15:55.392000 audit[4512]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4410 pid=4512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.392000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766376438613261386235343163313865653666383032613163343135 Jan 28 01:15:55.409000 audit: BPF prog-id=226 op=LOAD Jan 28 01:15:55.410000 audit: BPF prog-id=227 op=LOAD Jan 28 01:15:55.410000 audit[4546]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4517 pid=4546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.410000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364613131636563653361626637316530623763306465373736393632 Jan 28 01:15:55.410000 audit: BPF prog-id=227 op=UNLOAD Jan 28 
01:15:55.410000 audit[4546]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4517 pid=4546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.410000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364613131636563653361626637316530623763306465373736393632 Jan 28 01:15:55.410000 audit: BPF prog-id=228 op=LOAD Jan 28 01:15:55.410000 audit[4546]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4517 pid=4546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.410000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364613131636563653361626637316530623763306465373736393632 Jan 28 01:15:55.410000 audit: BPF prog-id=229 op=LOAD Jan 28 01:15:55.410000 audit[4546]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4517 pid=4546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.410000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364613131636563653361626637316530623763306465373736393632 Jan 28 
01:15:55.410000 audit: BPF prog-id=229 op=UNLOAD Jan 28 01:15:55.410000 audit[4546]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4517 pid=4546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.410000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364613131636563653361626637316530623763306465373736393632 Jan 28 01:15:55.410000 audit: BPF prog-id=228 op=UNLOAD Jan 28 01:15:55.410000 audit[4546]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4517 pid=4546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.410000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364613131636563653361626637316530623763306465373736393632 Jan 28 01:15:55.410000 audit: BPF prog-id=230 op=LOAD Jan 28 01:15:55.410000 audit[4546]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4517 pid=4546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.410000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364613131636563653361626637316530623763306465373736393632 Jan 28 01:15:55.431897 containerd[1674]: time="2026-01-28T01:15:55.431865393Z" level=info msg="StartContainer for \"7f7d8a2a8b541c18ee6f802a1c4159d5d609efeaa74f7ee992fe17406874aca7\" returns successfully" Jan 28 01:15:55.433232 systemd[1]: Started cri-containerd-5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214.scope - libcontainer container 5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214. Jan 28 01:15:55.442114 systemd-networkd[1562]: cali160fead5a46: Gained IPv6LL Jan 28 01:15:55.447000 audit: BPF prog-id=231 op=LOAD Jan 28 01:15:55.447000 audit: BPF prog-id=232 op=LOAD Jan 28 01:15:55.447000 audit[4601]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4585 pid=4601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.447000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561396662646638646361643462306364373465656636306333326631 Jan 28 01:15:55.447000 audit: BPF prog-id=232 op=UNLOAD Jan 28 01:15:55.447000 audit[4601]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.447000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561396662646638646361643462306364373465656636306333326631 Jan 28 01:15:55.448000 audit: BPF prog-id=233 op=LOAD Jan 28 01:15:55.448000 audit[4601]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4585 pid=4601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.448000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561396662646638646361643462306364373465656636306333326631 Jan 28 01:15:55.448000 audit: BPF prog-id=234 op=LOAD Jan 28 01:15:55.448000 audit[4601]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4585 pid=4601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.448000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561396662646638646361643462306364373465656636306333326631 Jan 28 01:15:55.448000 audit: BPF prog-id=234 op=UNLOAD Jan 28 01:15:55.448000 audit[4601]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 28 01:15:55.448000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561396662646638646361643462306364373465656636306333326631 Jan 28 01:15:55.448000 audit: BPF prog-id=233 op=UNLOAD Jan 28 01:15:55.448000 audit[4601]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.448000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561396662646638646361643462306364373465656636306333326631 Jan 28 01:15:55.448000 audit: BPF prog-id=235 op=LOAD Jan 28 01:15:55.448000 audit[4601]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4585 pid=4601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.448000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561396662646638646361643462306364373465656636306333326631 Jan 28 01:15:55.467572 containerd[1674]: time="2026-01-28T01:15:55.467542225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76f584f9b9-6dh2c,Uid:bead6395-8434-48df-aa67-e987782da70c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id 
\"3da11cece3abf71e0b7c0de776962d0b998ac72120335f5d218a05d8fad77136\"" Jan 28 01:15:55.492616 containerd[1674]: time="2026-01-28T01:15:55.492578060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-cblpt,Uid:c22de3ae-0a27-443f-9dd3-c4ab0a4176bd,Namespace:calico-system,Attempt:0,} returns sandbox id \"5a9fbdf8dcad4b0cd74eef60c32f1e9e2a4dcefcfb5ef380f588f3d184322214\"" Jan 28 01:15:55.739248 containerd[1674]: time="2026-01-28T01:15:55.739193001Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:15:55.741022 containerd[1674]: time="2026-01-28T01:15:55.740977896Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:15:55.741112 containerd[1674]: time="2026-01-28T01:15:55.741091639Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:15:55.741272 kubelet[2910]: E0128 01:15:55.741237 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:15:55.742138 kubelet[2910]: E0128 01:15:55.741290 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:15:55.742138 kubelet[2910]: E0128 01:15:55.741596 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgdzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-9mjk9_calico-apiserver(7ea90b44-fc7d-4702-a1a5-1c558b3ecd80): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:15:55.742249 containerd[1674]: time="2026-01-28T01:15:55.741613335Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:15:55.743518 kubelet[2910]: E0128 01:15:55.743494 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:15:55.747642 containerd[1674]: time="2026-01-28T01:15:55.747608755Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-24fxv,Uid:999bf8e1-3bd4-4ef0-8db0-79618ac53f0b,Namespace:kube-system,Attempt:0,}" Jan 28 01:15:55.879305 systemd-networkd[1562]: calid6a943a0979: Link UP Jan 28 01:15:55.879441 systemd-networkd[1562]: calid6a943a0979: Gained carrier Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.800 [INFO][4648] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0 coredns-674b8bbfcf- kube-system 999bf8e1-3bd4-4ef0-8db0-79618ac53f0b 810 0 2026-01-28 01:15:18 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-n-62761e1650 coredns-674b8bbfcf-24fxv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid6a943a0979 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Namespace="kube-system" Pod="coredns-674b8bbfcf-24fxv" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.800 [INFO][4648] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Namespace="kube-system" Pod="coredns-674b8bbfcf-24fxv" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.839 [INFO][4662] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" HandleID="k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Workload="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" Jan 28 01:15:55.894406 
containerd[1674]: 2026-01-28 01:15:55.839 [INFO][4662] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" HandleID="k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Workload="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c5870), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-n-62761e1650", "pod":"coredns-674b8bbfcf-24fxv", "timestamp":"2026-01-28 01:15:55.83948768 +0000 UTC"}, Hostname:"ci-4593-0-0-n-62761e1650", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.839 [INFO][4662] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.839 [INFO][4662] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.839 [INFO][4662] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-62761e1650' Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.846 [INFO][4662] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.851 [INFO][4662] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.856 [INFO][4662] ipam/ipam.go 511: Trying affinity for 192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.858 [INFO][4662] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.860 [INFO][4662] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.861 [INFO][4662] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.80.192/26 handle="k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.862 [INFO][4662] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041 Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.865 [INFO][4662] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.80.192/26 handle="k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.873 [INFO][4662] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.80.198/26] block=192.168.80.192/26 handle="k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.873 [INFO][4662] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.198/26] handle="k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.873 [INFO][4662] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 01:15:55.894406 containerd[1674]: 2026-01-28 01:15:55.873 [INFO][4662] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.80.198/26] IPv6=[] ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" HandleID="k8s-pod-network.9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Workload="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" Jan 28 01:15:55.895154 containerd[1674]: 2026-01-28 01:15:55.875 [INFO][4648] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Namespace="kube-system" Pod="coredns-674b8bbfcf-24fxv" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"999bf8e1-3bd4-4ef0-8db0-79618ac53f0b", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"", Pod:"coredns-674b8bbfcf-24fxv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.80.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid6a943a0979", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.895154 containerd[1674]: 2026-01-28 01:15:55.875 [INFO][4648] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.198/32] ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Namespace="kube-system" Pod="coredns-674b8bbfcf-24fxv" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" Jan 28 01:15:55.895154 containerd[1674]: 2026-01-28 01:15:55.875 [INFO][4648] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid6a943a0979 ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Namespace="kube-system" Pod="coredns-674b8bbfcf-24fxv" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" Jan 28 01:15:55.895154 containerd[1674]: 2026-01-28 01:15:55.877 [INFO][4648] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Namespace="kube-system" Pod="coredns-674b8bbfcf-24fxv" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" Jan 28 01:15:55.895154 containerd[1674]: 2026-01-28 01:15:55.877 [INFO][4648] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Namespace="kube-system" Pod="coredns-674b8bbfcf-24fxv" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"999bf8e1-3bd4-4ef0-8db0-79618ac53f0b", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041", Pod:"coredns-674b8bbfcf-24fxv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.80.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid6a943a0979", 
MAC:"6e:5a:ab:2e:f0:03", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:55.895154 containerd[1674]: 2026-01-28 01:15:55.891 [INFO][4648] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" Namespace="kube-system" Pod="coredns-674b8bbfcf-24fxv" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-coredns--674b8bbfcf--24fxv-eth0" Jan 28 01:15:55.907000 audit[4676]: NETFILTER_CFG table=filter:129 family=2 entries=54 op=nft_register_chain pid=4676 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:55.907000 audit[4676]: SYSCALL arch=c000003e syscall=46 success=yes exit=25572 a0=3 a1=7ffd01c06640 a2=0 a3=7ffd01c0662c items=0 ppid=4076 pid=4676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:55.907000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:55.926307 containerd[1674]: time="2026-01-28T01:15:55.926124959Z" level=info msg="connecting to shim 9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041" address="unix:///run/containerd/s/b73acd41b232bc2fcd42aec501599ad705a93d9955520bb3cef15961aa5c4cfd" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:55.945022 
kubelet[2910]: E0128 01:15:55.943448 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:15:55.954199 kubelet[2910]: I0128 01:15:55.953867 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-cxbk5" podStartSLOduration=37.953850584 podStartE2EDuration="37.953850584s" podCreationTimestamp="2026-01-28 01:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:15:55.952757125 +0000 UTC m=+43.308608610" watchObservedRunningTime="2026-01-28 01:15:55.953850584 +0000 UTC m=+43.309702041" Jan 28 01:15:55.977278 systemd[1]: Started cri-containerd-9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041.scope - libcontainer container 9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041. 
Jan 28 01:15:56.001000 audit[4712]: NETFILTER_CFG table=filter:130 family=2 entries=20 op=nft_register_rule pid=4712 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:56.001000 audit[4712]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd51bc2030 a2=0 a3=7ffd51bc201c items=0 ppid=3019 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.001000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:56.008000 audit[4712]: NETFILTER_CFG table=nat:131 family=2 entries=14 op=nft_register_rule pid=4712 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:56.008000 audit[4712]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffd51bc2030 a2=0 a3=0 items=0 ppid=3019 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.008000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:56.009000 audit: BPF prog-id=236 op=LOAD Jan 28 01:15:56.010000 audit: BPF prog-id=237 op=LOAD Jan 28 01:15:56.010000 audit[4697]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4686 pid=4697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.010000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966316239663632356231646166636162626136626635386565636135 Jan 28 01:15:56.011000 audit: BPF prog-id=237 op=UNLOAD Jan 28 01:15:56.011000 audit[4697]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4686 pid=4697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966316239663632356231646166636162626136626635386565636135 Jan 28 01:15:56.011000 audit: BPF prog-id=238 op=LOAD Jan 28 01:15:56.011000 audit[4697]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4686 pid=4697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966316239663632356231646166636162626136626635386565636135 Jan 28 01:15:56.011000 audit: BPF prog-id=239 op=LOAD Jan 28 01:15:56.011000 audit[4697]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4686 pid=4697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 28 01:15:56.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966316239663632356231646166636162626136626635386565636135 Jan 28 01:15:56.011000 audit: BPF prog-id=239 op=UNLOAD Jan 28 01:15:56.011000 audit[4697]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4686 pid=4697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966316239663632356231646166636162626136626635386565636135 Jan 28 01:15:56.011000 audit: BPF prog-id=238 op=UNLOAD Jan 28 01:15:56.011000 audit[4697]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4686 pid=4697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966316239663632356231646166636162626136626635386565636135 Jan 28 01:15:56.012000 audit: BPF prog-id=240 op=LOAD Jan 28 01:15:56.012000 audit[4697]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4686 pid=4697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.012000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966316239663632356231646166636162626136626635386565636135 Jan 28 01:15:56.029000 audit[4720]: NETFILTER_CFG table=filter:132 family=2 entries=17 op=nft_register_rule pid=4720 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:56.029000 audit[4720]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffea1b49a40 a2=0 a3=7ffea1b49a2c items=0 ppid=3019 pid=4720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.029000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:56.036000 audit[4720]: NETFILTER_CFG table=nat:133 family=2 entries=35 op=nft_register_chain pid=4720 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:56.036000 audit[4720]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffea1b49a40 a2=0 a3=7ffea1b49a2c items=0 ppid=3019 pid=4720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.036000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:56.059341 containerd[1674]: time="2026-01-28T01:15:56.059247443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-24fxv,Uid:999bf8e1-3bd4-4ef0-8db0-79618ac53f0b,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041\"" Jan 28 01:15:56.064537 containerd[1674]: time="2026-01-28T01:15:56.064474275Z" level=info msg="CreateContainer within sandbox \"9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 28 01:15:56.078664 containerd[1674]: time="2026-01-28T01:15:56.078617173Z" level=info msg="Container d6db845d642c3f1578eafe304c834cf05ec7d0fa9b2c6b25ca5218454363d8be: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:15:56.082596 containerd[1674]: time="2026-01-28T01:15:56.082546911Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:15:56.085879 containerd[1674]: time="2026-01-28T01:15:56.085792176Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:15:56.085879 containerd[1674]: time="2026-01-28T01:15:56.085857016Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:15:56.086769 containerd[1674]: time="2026-01-28T01:15:56.086436610Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 01:15:56.086798 kubelet[2910]: E0128 01:15:56.086000 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:15:56.086798 kubelet[2910]: E0128 01:15:56.086090 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:15:56.086798 kubelet[2910]: E0128 01:15:56.086588 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fx89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-6dh2c_calico-apiserver(bead6395-8434-48df-aa67-e987782da70c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:15:56.087811 kubelet[2910]: E0128 01:15:56.087761 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:15:56.088792 containerd[1674]: time="2026-01-28T01:15:56.088648360Z" level=info msg="CreateContainer within sandbox \"9f1b9f625b1dafcabba6bf58eeca58ade6520f4ec91516dead4b7a17bc5be041\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"d6db845d642c3f1578eafe304c834cf05ec7d0fa9b2c6b25ca5218454363d8be\"" Jan 28 01:15:56.089957 containerd[1674]: time="2026-01-28T01:15:56.089251169Z" level=info msg="StartContainer for \"d6db845d642c3f1578eafe304c834cf05ec7d0fa9b2c6b25ca5218454363d8be\"" Jan 28 01:15:56.089957 containerd[1674]: time="2026-01-28T01:15:56.089868370Z" level=info msg="connecting to shim d6db845d642c3f1578eafe304c834cf05ec7d0fa9b2c6b25ca5218454363d8be" address="unix:///run/containerd/s/b73acd41b232bc2fcd42aec501599ad705a93d9955520bb3cef15961aa5c4cfd" protocol=ttrpc version=3 Jan 28 01:15:56.114292 systemd[1]: Started cri-containerd-d6db845d642c3f1578eafe304c834cf05ec7d0fa9b2c6b25ca5218454363d8be.scope - libcontainer container d6db845d642c3f1578eafe304c834cf05ec7d0fa9b2c6b25ca5218454363d8be. Jan 28 01:15:56.126000 audit: BPF prog-id=241 op=LOAD Jan 28 01:15:56.126000 audit: BPF prog-id=242 op=LOAD Jan 28 01:15:56.126000 audit[4728]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4686 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436646238343564363432633366313537386561666533303463383334 Jan 28 01:15:56.126000 audit: BPF prog-id=242 op=UNLOAD Jan 28 01:15:56.126000 audit[4728]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4686 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.126000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436646238343564363432633366313537386561666533303463383334 Jan 28 01:15:56.126000 audit: BPF prog-id=243 op=LOAD Jan 28 01:15:56.126000 audit[4728]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4686 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436646238343564363432633366313537386561666533303463383334 Jan 28 01:15:56.126000 audit: BPF prog-id=244 op=LOAD Jan 28 01:15:56.126000 audit[4728]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4686 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436646238343564363432633366313537386561666533303463383334 Jan 28 01:15:56.126000 audit: BPF prog-id=244 op=UNLOAD Jan 28 01:15:56.126000 audit[4728]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4686 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 28 01:15:56.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436646238343564363432633366313537386561666533303463383334 Jan 28 01:15:56.126000 audit: BPF prog-id=243 op=UNLOAD Jan 28 01:15:56.126000 audit[4728]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4686 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436646238343564363432633366313537386561666533303463383334 Jan 28 01:15:56.126000 audit: BPF prog-id=245 op=LOAD Jan 28 01:15:56.126000 audit[4728]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4686 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436646238343564363432633366313537386561666533303463383334 Jan 28 01:15:56.145296 containerd[1674]: time="2026-01-28T01:15:56.145232579Z" level=info msg="StartContainer for \"d6db845d642c3f1578eafe304c834cf05ec7d0fa9b2c6b25ca5218454363d8be\" returns successfully" Jan 28 01:15:56.402303 systemd-networkd[1562]: cali0667187feaa: Gained IPv6LL Jan 28 01:15:56.419837 containerd[1674]: 
time="2026-01-28T01:15:56.419671943Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:15:56.421671 containerd[1674]: time="2026-01-28T01:15:56.421555661Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:15:56.421671 containerd[1674]: time="2026-01-28T01:15:56.421640134Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:15:56.421947 kubelet[2910]: E0128 01:15:56.421887 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:15:56.421947 kubelet[2910]: E0128 01:15:56.421931 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:15:56.422281 kubelet[2910]: E0128 01:15:56.422238 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqql6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cblpt_calico-system(c22de3ae-0a27-443f-9dd3-c4ab0a4176bd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:15:56.423633 kubelet[2910]: E0128 01:15:56.423595 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:15:56.658904 systemd-networkd[1562]: cali9b0cb966255: Gained IPv6LL Jan 28 01:15:56.748049 containerd[1674]: time="2026-01-28T01:15:56.747822562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79f9cd9ddf-ggx6z,Uid:72726466-f235-4a31-a84a-a3699d8c85f7,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:56.851750 
systemd-networkd[1562]: calid0363403551: Gained IPv6LL Jan 28 01:15:56.865918 systemd-networkd[1562]: cali14f37ca30fc: Link UP Jan 28 01:15:56.867470 systemd-networkd[1562]: cali14f37ca30fc: Gained carrier Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.789 [INFO][4760] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0 calico-kube-controllers-79f9cd9ddf- calico-system 72726466-f235-4a31-a84a-a3699d8c85f7 816 0 2026-01-28 01:15:33 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:79f9cd9ddf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4593-0-0-n-62761e1650 calico-kube-controllers-79f9cd9ddf-ggx6z eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali14f37ca30fc [] [] }} ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Namespace="calico-system" Pod="calico-kube-controllers-79f9cd9ddf-ggx6z" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.789 [INFO][4760] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Namespace="calico-system" Pod="calico-kube-controllers-79f9cd9ddf-ggx6z" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.820 [INFO][4772] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" 
HandleID="k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.820 [INFO][4772] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" HandleID="k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cb810), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-62761e1650", "pod":"calico-kube-controllers-79f9cd9ddf-ggx6z", "timestamp":"2026-01-28 01:15:56.82029209 +0000 UTC"}, Hostname:"ci-4593-0-0-n-62761e1650", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.820 [INFO][4772] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.820 [INFO][4772] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.820 [INFO][4772] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-62761e1650' Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.829 [INFO][4772] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.833 [INFO][4772] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.837 [INFO][4772] ipam/ipam.go 511: Trying affinity for 192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.841 [INFO][4772] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.843 [INFO][4772] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.843 [INFO][4772] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.80.192/26 handle="k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.845 [INFO][4772] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.853 [INFO][4772] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.80.192/26 handle="k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.859 [INFO][4772] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.80.199/26] block=192.168.80.192/26 handle="k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.860 [INFO][4772] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.199/26] handle="k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.860 [INFO][4772] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 01:15:56.886354 containerd[1674]: 2026-01-28 01:15:56.860 [INFO][4772] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.80.199/26] IPv6=[] ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" HandleID="k8s-pod-network.c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Workload="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" Jan 28 01:15:56.887387 containerd[1674]: 2026-01-28 01:15:56.861 [INFO][4760] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Namespace="calico-system" Pod="calico-kube-controllers-79f9cd9ddf-ggx6z" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0", GenerateName:"calico-kube-controllers-79f9cd9ddf-", Namespace:"calico-system", SelfLink:"", UID:"72726466-f235-4a31-a84a-a3699d8c85f7", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79f9cd9ddf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"", Pod:"calico-kube-controllers-79f9cd9ddf-ggx6z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.80.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali14f37ca30fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:56.887387 containerd[1674]: 2026-01-28 01:15:56.861 [INFO][4760] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.199/32] ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Namespace="calico-system" Pod="calico-kube-controllers-79f9cd9ddf-ggx6z" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" Jan 28 01:15:56.887387 containerd[1674]: 2026-01-28 01:15:56.861 [INFO][4760] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali14f37ca30fc ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Namespace="calico-system" Pod="calico-kube-controllers-79f9cd9ddf-ggx6z" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" Jan 28 01:15:56.887387 containerd[1674]: 2026-01-28 01:15:56.867 [INFO][4760] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Namespace="calico-system" Pod="calico-kube-controllers-79f9cd9ddf-ggx6z" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" Jan 28 01:15:56.887387 containerd[1674]: 2026-01-28 01:15:56.867 [INFO][4760] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Namespace="calico-system" Pod="calico-kube-controllers-79f9cd9ddf-ggx6z" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0", GenerateName:"calico-kube-controllers-79f9cd9ddf-", Namespace:"calico-system", SelfLink:"", UID:"72726466-f235-4a31-a84a-a3699d8c85f7", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79f9cd9ddf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a", Pod:"calico-kube-controllers-79f9cd9ddf-ggx6z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.80.199/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali14f37ca30fc", MAC:"4e:69:68:fc:74:31", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:56.887387 containerd[1674]: 2026-01-28 01:15:56.881 [INFO][4760] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" Namespace="calico-system" Pod="calico-kube-controllers-79f9cd9ddf-ggx6z" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-calico--kube--controllers--79f9cd9ddf--ggx6z-eth0" Jan 28 01:15:56.899000 audit[4786]: NETFILTER_CFG table=filter:134 family=2 entries=58 op=nft_register_chain pid=4786 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:56.901473 kernel: kauditd_printk_skb: 412 callbacks suppressed Jan 28 01:15:56.901527 kernel: audit: type=1325 audit(1769562956.899:723): table=filter:134 family=2 entries=58 op=nft_register_chain pid=4786 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:56.899000 audit[4786]: SYSCALL arch=c000003e syscall=46 success=yes exit=27164 a0=3 a1=7ffe51aee710 a2=0 a3=7ffe51aee6fc items=0 ppid=4076 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.904477 kernel: audit: type=1300 audit(1769562956.899:723): arch=c000003e syscall=46 success=yes exit=27164 a0=3 a1=7ffe51aee710 a2=0 a3=7ffe51aee6fc items=0 ppid=4076 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.899000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:56.909768 kernel: audit: type=1327 audit(1769562956.899:723): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:56.914349 systemd-networkd[1562]: calic611e4593bd: Gained IPv6LL Jan 28 01:15:56.928148 containerd[1674]: time="2026-01-28T01:15:56.928083736Z" level=info msg="connecting to shim c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a" address="unix:///run/containerd/s/7d7f9314a1fe5a947f91327274f23494bd9f587eb7e18415b2cf4095b1d79414" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:56.946227 kubelet[2910]: E0128 01:15:56.946111 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:15:56.947862 kubelet[2910]: E0128 01:15:56.946813 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:15:56.947862 kubelet[2910]: E0128 01:15:56.947370 2910 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:15:56.964560 systemd[1]: Started cri-containerd-c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a.scope - libcontainer container c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a. Jan 28 01:15:56.988000 audit: BPF prog-id=246 op=LOAD Jan 28 01:15:56.991043 kernel: audit: type=1334 audit(1769562956.988:724): prog-id=246 op=LOAD Jan 28 01:15:56.989000 audit: BPF prog-id=247 op=LOAD Jan 28 01:15:56.993027 kernel: audit: type=1334 audit(1769562956.989:725): prog-id=247 op=LOAD Jan 28 01:15:56.989000 audit[4806]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.997028 kernel: audit: type=1300 audit(1769562956.989:725): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.989000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:57.004025 kernel: audit: type=1327 audit(1769562956.989:725): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:56.989000 audit: BPF prog-id=247 op=UNLOAD Jan 28 01:15:57.007019 kernel: audit: type=1334 audit(1769562956.989:726): prog-id=247 op=UNLOAD Jan 28 01:15:57.011467 kernel: audit: type=1300 audit(1769562956.989:726): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.989000 audit[4806]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.989000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:57.018033 kernel: audit: type=1327 audit(1769562956.989:726): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:56.989000 audit: BPF prog-id=248 op=LOAD Jan 28 01:15:56.989000 audit[4806]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.989000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:56.990000 audit: BPF prog-id=249 op=LOAD Jan 28 01:15:56.990000 audit[4806]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:56.990000 audit: BPF prog-id=249 op=UNLOAD Jan 28 01:15:56.990000 audit[4806]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 28 01:15:56.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:56.990000 audit: BPF prog-id=248 op=UNLOAD Jan 28 01:15:56.990000 audit[4806]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:56.990000 audit: BPF prog-id=250 op=LOAD Jan 28 01:15:56.990000 audit[4806]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4795 pid=4806 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:56.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6336363339353261373336313362633833326366323539323961346435 Jan 28 01:15:57.009000 audit[4827]: NETFILTER_CFG table=filter:135 family=2 entries=14 op=nft_register_rule pid=4827 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:57.009000 audit[4827]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcca472b60 a2=0 a3=7ffcca472b4c items=0 
ppid=3019 pid=4827 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:57.009000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:57.020000 audit[4827]: NETFILTER_CFG table=nat:136 family=2 entries=20 op=nft_register_rule pid=4827 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:57.020000 audit[4827]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffcca472b60 a2=0 a3=7ffcca472b4c items=0 ppid=3019 pid=4827 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:57.020000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:57.039051 kubelet[2910]: I0128 01:15:57.038902 2910 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-24fxv" podStartSLOduration=39.038887474 podStartE2EDuration="39.038887474s" podCreationTimestamp="2026-01-28 01:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:15:57.018309451 +0000 UTC m=+44.374160919" watchObservedRunningTime="2026-01-28 01:15:57.038887474 +0000 UTC m=+44.394738940" Jan 28 01:15:57.064938 containerd[1674]: time="2026-01-28T01:15:57.064898922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79f9cd9ddf-ggx6z,Uid:72726466-f235-4a31-a84a-a3699d8c85f7,Namespace:calico-system,Attempt:0,} returns sandbox id \"c663952a73613bc832cf25929a4d580d56c07a8008f4bbbb623d1a0fac2a780a\"" Jan 28 01:15:57.066954 
containerd[1674]: time="2026-01-28T01:15:57.066934731Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:15:57.401820 containerd[1674]: time="2026-01-28T01:15:57.401775296Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:15:57.404288 containerd[1674]: time="2026-01-28T01:15:57.404258634Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:15:57.404657 containerd[1674]: time="2026-01-28T01:15:57.404336344Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 01:15:57.404714 kubelet[2910]: E0128 01:15:57.404497 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:15:57.404714 kubelet[2910]: E0128 01:15:57.404566 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:15:57.405145 kubelet[2910]: E0128 01:15:57.404975 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nnlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79f9cd9ddf-ggx6z_calico-system(72726466-f235-4a31-a84a-a3699d8c85f7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:15:57.406280 kubelet[2910]: E0128 01:15:57.406246 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:15:57.746530 systemd-networkd[1562]: calid6a943a0979: Gained IPv6LL Jan 28 01:15:57.947968 kubelet[2910]: E0128 01:15:57.947483 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:15:58.032000 audit[4836]: NETFILTER_CFG table=filter:137 family=2 entries=14 op=nft_register_rule pid=4836 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:58.032000 audit[4836]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd0744b760 a2=0 a3=7ffd0744b74c items=0 ppid=3019 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:58.032000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:58.054000 audit[4836]: NETFILTER_CFG table=nat:138 family=2 entries=56 op=nft_register_chain pid=4836 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:15:58.054000 audit[4836]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffd0744b760 a2=0 a3=7ffd0744b74c items=0 ppid=3019 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:58.054000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:15:58.451153 systemd-networkd[1562]: cali14f37ca30fc: Gained IPv6LL Jan 28 01:15:58.949907 kubelet[2910]: E0128 01:15:58.949849 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:15:59.748332 containerd[1674]: time="2026-01-28T01:15:59.748248294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lkr4f,Uid:e60412d5-27c3-4569-9b64-5743c10cc437,Namespace:calico-system,Attempt:0,}" Jan 28 01:15:59.867779 systemd-networkd[1562]: cali0cc84633dc0: Link UP Jan 28 01:15:59.869357 systemd-networkd[1562]: cali0cc84633dc0: Gained carrier Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.793 [INFO][4845] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0 csi-node-driver- calico-system e60412d5-27c3-4569-9b64-5743c10cc437 710 0 2026-01-28 01:15:33 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4593-0-0-n-62761e1650 csi-node-driver-lkr4f eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0cc84633dc0 [] [] }} ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Namespace="calico-system" Pod="csi-node-driver-lkr4f" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.793 [INFO][4845] cni-plugin/k8s.go 74: Extracted identifiers for 
CmdAddK8s ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Namespace="calico-system" Pod="csi-node-driver-lkr4f" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.823 [INFO][4856] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" HandleID="k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Workload="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.823 [INFO][4856] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" HandleID="k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Workload="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cb290), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-62761e1650", "pod":"csi-node-driver-lkr4f", "timestamp":"2026-01-28 01:15:59.823561496 +0000 UTC"}, Hostname:"ci-4593-0-0-n-62761e1650", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.823 [INFO][4856] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.823 [INFO][4856] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.823 [INFO][4856] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-62761e1650' Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.831 [INFO][4856] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.837 [INFO][4856] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.841 [INFO][4856] ipam/ipam.go 511: Trying affinity for 192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.844 [INFO][4856] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.846 [INFO][4856] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.192/26 host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.846 [INFO][4856] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.80.192/26 handle="k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.848 [INFO][4856] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.854 [INFO][4856] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.80.192/26 handle="k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.861 [INFO][4856] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.80.200/26] block=192.168.80.192/26 handle="k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.861 [INFO][4856] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.200/26] handle="k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" host="ci-4593-0-0-n-62761e1650" Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.861 [INFO][4856] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 01:15:59.889048 containerd[1674]: 2026-01-28 01:15:59.861 [INFO][4856] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.80.200/26] IPv6=[] ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" HandleID="k8s-pod-network.6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Workload="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" Jan 28 01:15:59.890365 containerd[1674]: 2026-01-28 01:15:59.863 [INFO][4845] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Namespace="calico-system" Pod="csi-node-driver-lkr4f" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e60412d5-27c3-4569-9b64-5743c10cc437", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"", Pod:"csi-node-driver-lkr4f", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.80.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0cc84633dc0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:59.890365 containerd[1674]: 2026-01-28 01:15:59.863 [INFO][4845] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.200/32] ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Namespace="calico-system" Pod="csi-node-driver-lkr4f" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" Jan 28 01:15:59.890365 containerd[1674]: 2026-01-28 01:15:59.863 [INFO][4845] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0cc84633dc0 ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Namespace="calico-system" Pod="csi-node-driver-lkr4f" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" Jan 28 01:15:59.890365 containerd[1674]: 2026-01-28 01:15:59.869 [INFO][4845] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Namespace="calico-system" Pod="csi-node-driver-lkr4f" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" Jan 28 01:15:59.890365 
containerd[1674]: 2026-01-28 01:15:59.870 [INFO][4845] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Namespace="calico-system" Pod="csi-node-driver-lkr4f" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e60412d5-27c3-4569-9b64-5743c10cc437", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 15, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-62761e1650", ContainerID:"6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa", Pod:"csi-node-driver-lkr4f", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.80.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0cc84633dc0", MAC:"56:32:49:a3:1b:d3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:15:59.890365 containerd[1674]: 
2026-01-28 01:15:59.886 [INFO][4845] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" Namespace="calico-system" Pod="csi-node-driver-lkr4f" WorkloadEndpoint="ci--4593--0--0--n--62761e1650-k8s-csi--node--driver--lkr4f-eth0" Jan 28 01:15:59.919000 audit[4870]: NETFILTER_CFG table=filter:139 family=2 entries=58 op=nft_register_chain pid=4870 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:15:59.919000 audit[4870]: SYSCALL arch=c000003e syscall=46 success=yes exit=27148 a0=3 a1=7fff5810bad0 a2=0 a3=7fff5810babc items=0 ppid=4076 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:59.919000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:15:59.926853 containerd[1674]: time="2026-01-28T01:15:59.926785126Z" level=info msg="connecting to shim 6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa" address="unix:///run/containerd/s/98950a54135148b8eec442666ff7be24d9b4c1cde82fa9ab2513a8ac381f7053" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:15:59.955208 systemd[1]: Started cri-containerd-6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa.scope - libcontainer container 6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa. 
Jan 28 01:15:59.964000 audit: BPF prog-id=251 op=LOAD Jan 28 01:15:59.964000 audit: BPF prog-id=252 op=LOAD Jan 28 01:15:59.964000 audit[4893]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4880 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:59.964000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664616465393239636263333961666136313535613133383562623635 Jan 28 01:15:59.965000 audit: BPF prog-id=252 op=UNLOAD Jan 28 01:15:59.965000 audit[4893]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:59.965000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664616465393239636263333961666136313535613133383562623635 Jan 28 01:15:59.965000 audit: BPF prog-id=253 op=LOAD Jan 28 01:15:59.965000 audit[4893]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4880 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:59.965000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664616465393239636263333961666136313535613133383562623635 Jan 28 01:15:59.965000 audit: BPF prog-id=254 op=LOAD Jan 28 01:15:59.965000 audit[4893]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4880 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:59.965000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664616465393239636263333961666136313535613133383562623635 Jan 28 01:15:59.965000 audit: BPF prog-id=254 op=UNLOAD Jan 28 01:15:59.965000 audit[4893]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:59.965000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664616465393239636263333961666136313535613133383562623635 Jan 28 01:15:59.965000 audit: BPF prog-id=253 op=UNLOAD Jan 28 01:15:59.965000 audit[4893]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:15:59.965000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664616465393239636263333961666136313535613133383562623635 Jan 28 01:15:59.965000 audit: BPF prog-id=255 op=LOAD Jan 28 01:15:59.965000 audit[4893]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4880 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:15:59.965000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664616465393239636263333961666136313535613133383562623635 Jan 28 01:15:59.983065 containerd[1674]: time="2026-01-28T01:15:59.982975934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lkr4f,Uid:e60412d5-27c3-4569-9b64-5743c10cc437,Namespace:calico-system,Attempt:0,} returns sandbox id \"6dade929cbc39afa6155a1385bb65a24b9be3b61da315e79e70ca84b393503aa\"" Jan 28 01:15:59.984827 containerd[1674]: time="2026-01-28T01:15:59.984798474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:16:00.318855 containerd[1674]: time="2026-01-28T01:16:00.318812115Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:00.320814 containerd[1674]: time="2026-01-28T01:16:00.320727665Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:16:00.320814 containerd[1674]: 
time="2026-01-28T01:16:00.320784110Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:00.321140 kubelet[2910]: E0128 01:16:00.321096 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:16:00.321411 kubelet[2910]: E0128 01:16:00.321168 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:16:00.323307 kubelet[2910]: E0128 01:16:00.323246 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 28 01:16:00.325820 containerd[1674]: time="2026-01-28T01:16:00.325618136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:16:00.653493 containerd[1674]: time="2026-01-28T01:16:00.653447918Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:00.655775 containerd[1674]: time="2026-01-28T01:16:00.655731578Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:16:00.655850 containerd[1674]: time="2026-01-28T01:16:00.655820696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:00.656011 kubelet[2910]: E0128 01:16:00.655969 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:16:00.656053 kubelet[2910]: E0128 01:16:00.656042 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:16:00.656383 kubelet[2910]: E0128 01:16:00.656160 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:00.657399 kubelet[2910]: E0128 01:16:00.657368 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:16:00.953862 kubelet[2910]: E0128 01:16:00.953696 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:16:01.010358 systemd-networkd[1562]: cali0cc84633dc0: Gained IPv6LL Jan 28 01:16:01.956086 kubelet[2910]: E0128 01:16:01.955996 2910 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:16:08.749366 containerd[1674]: time="2026-01-28T01:16:08.749120495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:16:09.081761 containerd[1674]: time="2026-01-28T01:16:09.081641817Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:09.083469 containerd[1674]: time="2026-01-28T01:16:09.083407459Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:16:09.083469 containerd[1674]: time="2026-01-28T01:16:09.083439252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:09.083617 kubelet[2910]: E0128 01:16:09.083581 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:16:09.083926 kubelet[2910]: E0128 01:16:09.083626 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:16:09.083926 kubelet[2910]: E0128 01:16:09.083900 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgdzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-9mjk9_calico-apiserver(7ea90b44-fc7d-4702-a1a5-1c558b3ecd80): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:09.084260 containerd[1674]: time="2026-01-28T01:16:09.084244631Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:16:09.085382 kubelet[2910]: E0128 01:16:09.085346 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:16:09.423679 containerd[1674]: time="2026-01-28T01:16:09.423527718Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
01:16:09.425537 containerd[1674]: time="2026-01-28T01:16:09.425435435Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:16:09.425537 containerd[1674]: time="2026-01-28T01:16:09.425509237Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:09.425728 kubelet[2910]: E0128 01:16:09.425668 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:16:09.425728 kubelet[2910]: E0128 01:16:09.425710 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:16:09.426001 kubelet[2910]: E0128 01:16:09.425964 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8d1c1aed77714da78e5314e3a5e614ef,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:09.428594 containerd[1674]: time="2026-01-28T01:16:09.428570095Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:16:09.769285 containerd[1674]: 
time="2026-01-28T01:16:09.769122385Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:09.771443 containerd[1674]: time="2026-01-28T01:16:09.771363115Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:16:09.771443 containerd[1674]: time="2026-01-28T01:16:09.771405397Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:09.771748 kubelet[2910]: E0128 01:16:09.771699 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:16:09.771806 kubelet[2910]: E0128 01:16:09.771761 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:16:09.772324 kubelet[2910]: E0128 01:16:09.772255 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:09.773468 kubelet[2910]: E0128 01:16:09.773418 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:16:11.754067 containerd[1674]: time="2026-01-28T01:16:11.752795083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 01:16:12.096644 containerd[1674]: time="2026-01-28T01:16:12.096598703Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:12.098758 containerd[1674]: time="2026-01-28T01:16:12.098729601Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:16:12.098821 containerd[1674]: time="2026-01-28T01:16:12.098807074Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:12.099002 kubelet[2910]: E0128 01:16:12.098962 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:16:12.099287 kubelet[2910]: E0128 01:16:12.099019 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:16:12.099287 kubelet[2910]: E0128 01:16:12.099223 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqql6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPa
thExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cblpt_calico-system(c22de3ae-0a27-443f-9dd3-c4ab0a4176bd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:12.100409 kubelet[2910]: E0128 01:16:12.100356 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 
28 01:16:12.100551 containerd[1674]: time="2026-01-28T01:16:12.100502605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:16:12.438745 containerd[1674]: time="2026-01-28T01:16:12.438480882Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:12.441969 containerd[1674]: time="2026-01-28T01:16:12.441818293Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:16:12.441969 containerd[1674]: time="2026-01-28T01:16:12.441937236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:12.442483 kubelet[2910]: E0128 01:16:12.442376 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:16:12.442536 kubelet[2910]: E0128 01:16:12.442501 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:16:12.442855 kubelet[2910]: E0128 01:16:12.442761 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nnlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79f9cd9ddf-ggx6z_calico-system(72726466-f235-4a31-a84a-a3699d8c85f7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:12.444354 kubelet[2910]: E0128 01:16:12.444252 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:16:12.750052 containerd[1674]: time="2026-01-28T01:16:12.749906634Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:16:13.103234 containerd[1674]: time="2026-01-28T01:16:13.103196522Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
01:16:13.104880 containerd[1674]: time="2026-01-28T01:16:13.104846370Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:16:13.104952 containerd[1674]: time="2026-01-28T01:16:13.104925974Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:13.105099 kubelet[2910]: E0128 01:16:13.105070 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:16:13.105370 kubelet[2910]: E0128 01:16:13.105113 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:16:13.105370 kubelet[2910]: E0128 01:16:13.105239 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fx89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-6dh2c_calico-apiserver(bead6395-8434-48df-aa67-e987782da70c): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:13.106334 kubelet[2910]: E0128 01:16:13.106308 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:16:15.750323 containerd[1674]: time="2026-01-28T01:16:15.750214311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:16:16.093052 containerd[1674]: time="2026-01-28T01:16:16.092902097Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:16.094766 containerd[1674]: time="2026-01-28T01:16:16.094708684Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:16:16.095084 containerd[1674]: time="2026-01-28T01:16:16.094735318Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:16.095138 kubelet[2910]: E0128 01:16:16.095080 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:16:16.095559 kubelet[2910]: E0128 01:16:16.095136 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:16:16.095559 kubelet[2910]: E0128 01:16:16.095288 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:16.097857 containerd[1674]: time="2026-01-28T01:16:16.097799221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:16:16.436121 containerd[1674]: time="2026-01-28T01:16:16.435991067Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:16.437535 containerd[1674]: time="2026-01-28T01:16:16.437480847Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:16:16.437535 containerd[1674]: time="2026-01-28T01:16:16.437511965Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:16.437752 kubelet[2910]: E0128 01:16:16.437709 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:16:16.437795 kubelet[2910]: E0128 01:16:16.437764 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:16:16.438849 kubelet[2910]: E0128 01:16:16.438803 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:16.440092 kubelet[2910]: E0128 01:16:16.440051 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:16:19.749308 kubelet[2910]: E0128 01:16:19.748710 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:16:22.750307 kubelet[2910]: E0128 01:16:22.750248 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:16:24.751327 kubelet[2910]: E0128 01:16:24.751272 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:16:25.748792 kubelet[2910]: E0128 01:16:25.748430 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:16:26.748718 kubelet[2910]: E0128 01:16:26.748639 2910 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:16:30.749062 kubelet[2910]: E0128 01:16:30.748368 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:16:34.750181 containerd[1674]: time="2026-01-28T01:16:34.749948872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:16:35.078132 containerd[1674]: time="2026-01-28T01:16:35.078022565Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:35.079688 containerd[1674]: time="2026-01-28T01:16:35.079655320Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:16:35.079769 containerd[1674]: time="2026-01-28T01:16:35.079730393Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:35.079929 kubelet[2910]: E0128 01:16:35.079900 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:16:35.081216 kubelet[2910]: E0128 01:16:35.079941 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:16:35.081216 kubelet[2910]: E0128 01:16:35.080233 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgdzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-9mjk9_calico-apiserver(7ea90b44-fc7d-4702-a1a5-1c558b3ecd80): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:35.081329 containerd[1674]: time="2026-01-28T01:16:35.081066079Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:16:35.081730 kubelet[2910]: E0128 01:16:35.081702 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:16:35.406629 containerd[1674]: time="2026-01-28T01:16:35.406386571Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:35.408197 containerd[1674]: time="2026-01-28T01:16:35.408119248Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:16:35.408472 containerd[1674]: time="2026-01-28T01:16:35.408185388Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:35.408758 kubelet[2910]: E0128 01:16:35.408626 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:16:35.408852 kubelet[2910]: E0128 01:16:35.408839 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:16:35.409082 kubelet[2910]: E0128 01:16:35.409038 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8d1c1aed77714da78e5314e3a5e614ef,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:35.412129 containerd[1674]: time="2026-01-28T01:16:35.412108187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:16:35.751633 containerd[1674]: time="2026-01-28T01:16:35.749088467Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:35.751633 containerd[1674]: time="2026-01-28T01:16:35.750816088Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:16:35.751633 containerd[1674]: time="2026-01-28T01:16:35.750889641Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:35.754186 kubelet[2910]: E0128 01:16:35.752416 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:16:35.754186 kubelet[2910]: E0128 01:16:35.752456 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:16:35.754186 kubelet[2910]: E0128 01:16:35.752581 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:35.754539 kubelet[2910]: E0128 01:16:35.753806 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:16:37.748436 containerd[1674]: time="2026-01-28T01:16:37.748197862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:16:38.076628 containerd[1674]: time="2026-01-28T01:16:38.076525177Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:38.078179 containerd[1674]: time="2026-01-28T01:16:38.078144486Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:16:38.078262 containerd[1674]: time="2026-01-28T01:16:38.078221146Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:38.079190 kubelet[2910]: E0128 01:16:38.079146 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:16:38.080349 kubelet[2910]: E0128 01:16:38.079202 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:16:38.080349 kubelet[2910]: E0128 01:16:38.079328 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fx89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-6dh2c_calico-apiserver(bead6395-8434-48df-aa67-e987782da70c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:38.080476 kubelet[2910]: E0128 01:16:38.080458 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:16:38.751380 containerd[1674]: time="2026-01-28T01:16:38.751162780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 01:16:39.094595 containerd[1674]: time="2026-01-28T01:16:39.094541756Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
01:16:39.097404 containerd[1674]: time="2026-01-28T01:16:39.097350695Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:16:39.097801 containerd[1674]: time="2026-01-28T01:16:39.097370847Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:39.097846 kubelet[2910]: E0128 01:16:39.097630 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:16:39.097846 kubelet[2910]: E0128 01:16:39.097687 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:16:39.098737 kubelet[2910]: E0128 01:16:39.097851 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqql6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cblpt_calico-system(c22de3ae-0a27-443f-9dd3-c4ab0a4176bd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:39.099251 kubelet[2910]: E0128 01:16:39.099198 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:16:39.747618 containerd[1674]: time="2026-01-28T01:16:39.747551161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:16:40.091914 containerd[1674]: time="2026-01-28T01:16:40.091136421Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:40.093094 containerd[1674]: 
time="2026-01-28T01:16:40.092998645Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:16:40.093226 containerd[1674]: time="2026-01-28T01:16:40.093179666Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:40.093346 kubelet[2910]: E0128 01:16:40.093314 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:16:40.093386 kubelet[2910]: E0128 01:16:40.093358 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:16:40.093514 kubelet[2910]: E0128 01:16:40.093480 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nnlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79f9cd9ddf-ggx6z_calico-system(72726466-f235-4a31-a84a-a3699d8c85f7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:40.094862 kubelet[2910]: E0128 01:16:40.094832 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:16:44.749287 containerd[1674]: time="2026-01-28T01:16:44.749218781Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:16:45.097188 containerd[1674]: time="2026-01-28T01:16:45.097141637Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
01:16:45.099790 containerd[1674]: time="2026-01-28T01:16:45.099735577Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:16:45.100016 containerd[1674]: time="2026-01-28T01:16:45.099819325Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:45.100067 kubelet[2910]: E0128 01:16:45.100020 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:16:45.100783 kubelet[2910]: E0128 01:16:45.100095 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:16:45.100783 kubelet[2910]: E0128 01:16:45.100260 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 28 01:16:45.103468 containerd[1674]: time="2026-01-28T01:16:45.103266860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:16:45.460328 containerd[1674]: time="2026-01-28T01:16:45.460213884Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:16:45.462562 containerd[1674]: time="2026-01-28T01:16:45.462472896Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:16:45.462562 containerd[1674]: time="2026-01-28T01:16:45.462533744Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:16:45.462875 kubelet[2910]: E0128 01:16:45.462835 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:16:45.462924 kubelet[2910]: E0128 01:16:45.462884 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:16:45.463327 kubelet[2910]: E0128 01:16:45.463021 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:16:45.464234 kubelet[2910]: E0128 01:16:45.464176 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:16:49.748186 kubelet[2910]: E0128 01:16:49.747882 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:16:49.748186 kubelet[2910]: E0128 01:16:49.747884 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" 
podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:16:50.749820 kubelet[2910]: E0128 01:16:50.749766 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:16:50.751559 kubelet[2910]: E0128 01:16:50.751512 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:16:52.749052 kubelet[2910]: E0128 01:16:52.748535 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:17:00.754026 kubelet[2910]: E0128 01:17:00.753953 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:17:01.752609 kubelet[2910]: E0128 01:17:01.748265 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:17:03.749240 kubelet[2910]: E0128 01:17:03.749141 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: 
rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:17:03.750584 kubelet[2910]: E0128 01:17:03.749146 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:17:05.747977 kubelet[2910]: E0128 01:17:05.747913 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:17:05.749436 kubelet[2910]: E0128 01:17:05.749406 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for 
\"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:17:14.749247 kubelet[2910]: E0128 01:17:14.749181 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:17:14.750598 kubelet[2910]: E0128 01:17:14.750513 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" 
podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:17:15.748462 kubelet[2910]: E0128 01:17:15.748419 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:17:16.761581 containerd[1674]: time="2026-01-28T01:17:16.761197799Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:17:17.172864 containerd[1674]: time="2026-01-28T01:17:17.172692989Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:17:17.175414 containerd[1674]: time="2026-01-28T01:17:17.175317358Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:17:17.175414 containerd[1674]: time="2026-01-28T01:17:17.175376958Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:17:17.175684 kubelet[2910]: E0128 01:17:17.175654 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:17:17.177089 kubelet[2910]: E0128 01:17:17.176051 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:17:17.177171 containerd[1674]: time="2026-01-28T01:17:17.176316856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:17:17.177577 kubelet[2910]: E0128 01:17:17.177544 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8d1c1aed77714da78e5314e3a5e614ef,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:17:17.521193 containerd[1674]: time="2026-01-28T01:17:17.520462320Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:17:17.523989 containerd[1674]: time="2026-01-28T01:17:17.523890548Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:17:17.523989 containerd[1674]: time="2026-01-28T01:17:17.523945229Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:17:17.524299 kubelet[2910]: E0128 01:17:17.524270 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:17:17.524385 kubelet[2910]: E0128 01:17:17.524375 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:17:17.525451 kubelet[2910]: E0128 01:17:17.524628 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 
--tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgdzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-76f584f9b9-9mjk9_calico-apiserver(7ea90b44-fc7d-4702-a1a5-1c558b3ecd80): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:17:17.525600 containerd[1674]: time="2026-01-28T01:17:17.525276790Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:17:17.526185 kubelet[2910]: E0128 01:17:17.526161 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:17:17.748530 kubelet[2910]: E0128 01:17:17.748482 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:17:17.870434 containerd[1674]: time="2026-01-28T01:17:17.870267711Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:17:17.871950 containerd[1674]: time="2026-01-28T01:17:17.871816040Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:17:17.871950 containerd[1674]: time="2026-01-28T01:17:17.871893131Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:17:17.872358 kubelet[2910]: E0128 01:17:17.872303 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:17:17.872358 kubelet[2910]: E0128 01:17:17.872357 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:17:17.872521 kubelet[2910]: E0128 01:17:17.872473 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:17:17.873891 kubelet[2910]: E0128 01:17:17.873848 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:17:26.750492 containerd[1674]: time="2026-01-28T01:17:26.750451978Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:17:27.081572 containerd[1674]: time="2026-01-28T01:17:27.080670082Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:17:27.082822 containerd[1674]: time="2026-01-28T01:17:27.082731487Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:17:27.082822 containerd[1674]: time="2026-01-28T01:17:27.082771587Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:17:27.083071 kubelet[2910]: E0128 01:17:27.083034 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:17:27.085093 kubelet[2910]: E0128 01:17:27.083079 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:17:27.085093 kubelet[2910]: E0128 01:17:27.083202 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fx89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-6dh2c_calico-apiserver(bead6395-8434-48df-aa67-e987782da70c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:17:27.085093 kubelet[2910]: E0128 01:17:27.084896 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:17:27.748164 containerd[1674]: time="2026-01-28T01:17:27.748081645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:17:28.077044 containerd[1674]: time="2026-01-28T01:17:28.076815689Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
01:17:28.078770 containerd[1674]: time="2026-01-28T01:17:28.078681917Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:17:28.078770 containerd[1674]: time="2026-01-28T01:17:28.078730452Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 01:17:28.078897 kubelet[2910]: E0128 01:17:28.078865 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:17:28.078934 kubelet[2910]: E0128 01:17:28.078904 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:17:28.079276 kubelet[2910]: E0128 01:17:28.079046 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nnlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79f9cd9ddf-ggx6z_calico-system(72726466-f235-4a31-a84a-a3699d8c85f7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:17:28.080208 kubelet[2910]: E0128 01:17:28.080182 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:17:28.750217 kubelet[2910]: E0128 01:17:28.750168 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc 
= failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:17:29.623022 kernel: hrtimer: interrupt took 2573456 ns Jan 28 01:17:29.749633 kubelet[2910]: E0128 01:17:29.748169 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:17:29.749858 containerd[1674]: time="2026-01-28T01:17:29.748670519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:17:30.102098 containerd[1674]: time="2026-01-28T01:17:30.102044241Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:17:30.104595 containerd[1674]: time="2026-01-28T01:17:30.104507314Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:17:30.104595 containerd[1674]: time="2026-01-28T01:17:30.104560856Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:17:30.104949 kubelet[2910]: E0128 01:17:30.104899 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:17:30.105271 kubelet[2910]: E0128 01:17:30.104968 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:17:30.105314 kubelet[2910]: E0128 01:17:30.105259 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 28 01:17:30.108976 containerd[1674]: time="2026-01-28T01:17:30.108934373Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:17:30.430135 containerd[1674]: time="2026-01-28T01:17:30.429972807Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:17:30.528550 containerd[1674]: time="2026-01-28T01:17:30.528464008Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:17:30.528731 containerd[1674]: time="2026-01-28T01:17:30.528610071Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:17:30.531935 kubelet[2910]: E0128 01:17:30.531855 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:17:30.532024 kubelet[2910]: E0128 01:17:30.531950 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:17:30.532356 kubelet[2910]: E0128 01:17:30.532224 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:17:30.533502 kubelet[2910]: E0128 01:17:30.533458 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:17:30.751526 containerd[1674]: time="2026-01-28T01:17:30.751137994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 01:17:31.081032 containerd[1674]: time="2026-01-28T01:17:31.080637191Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:17:31.082511 containerd[1674]: time="2026-01-28T01:17:31.082404134Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:17:31.082511 containerd[1674]: time="2026-01-28T01:17:31.082425818Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:17:31.082693 kubelet[2910]: E0128 01:17:31.082661 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:17:31.082798 kubelet[2910]: E0128 01:17:31.082705 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:17:31.083023 kubelet[2910]: E0128 01:17:31.082880 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqql6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{Probe
Handler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cblpt_calico-system(c22de3ae-0a27-443f-9dd3-c4ab0a4176bd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:17:31.084288 kubelet[2910]: E0128 01:17:31.084250 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:17:34.199315 update_engine[1660]: I20260128 
01:17:34.198289 1660 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 28 01:17:34.199315 update_engine[1660]: I20260128 01:17:34.198346 1660 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 28 01:17:34.199315 update_engine[1660]: I20260128 01:17:34.198556 1660 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 28 01:17:34.201931 update_engine[1660]: I20260128 01:17:34.201839 1660 omaha_request_params.cc:62] Current group set to alpha Jan 28 01:17:34.202304 update_engine[1660]: I20260128 01:17:34.202286 1660 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 28 01:17:34.202387 update_engine[1660]: I20260128 01:17:34.202378 1660 update_attempter.cc:643] Scheduling an action processor start. Jan 28 01:17:34.202851 update_engine[1660]: I20260128 01:17:34.202833 1660 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 28 01:17:34.213067 update_engine[1660]: I20260128 01:17:34.213021 1660 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 28 01:17:34.213561 update_engine[1660]: I20260128 01:17:34.213306 1660 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 28 01:17:34.213632 locksmithd[1715]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 28 01:17:34.214298 update_engine[1660]: I20260128 01:17:34.213891 1660 omaha_request_action.cc:272] Request: Jan 28 01:17:34.214298 update_engine[1660]: Jan 28 01:17:34.214298 update_engine[1660]: Jan 28 01:17:34.214298 update_engine[1660]: Jan 28 01:17:34.214298 update_engine[1660]: Jan 28 01:17:34.214298 update_engine[1660]: Jan 28 01:17:34.214298 update_engine[1660]: Jan 28 01:17:34.214298 update_engine[1660]: Jan 28 01:17:34.214298 update_engine[1660]: Jan 28 01:17:34.214298 update_engine[1660]: I20260128 01:17:34.213909 1660 libcurl_http_fetcher.cc:47] 
Starting/Resuming transfer Jan 28 01:17:34.221476 update_engine[1660]: I20260128 01:17:34.220145 1660 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 28 01:17:34.221476 update_engine[1660]: I20260128 01:17:34.220764 1660 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 28 01:17:34.228574 update_engine[1660]: E20260128 01:17:34.228488 1660 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 28 01:17:34.228792 update_engine[1660]: I20260128 01:17:34.228778 1660 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 28 01:17:41.747926 kubelet[2910]: E0128 01:17:41.747858 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:17:41.749020 kubelet[2910]: E0128 01:17:41.748276 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:17:42.750541 kubelet[2910]: E0128 01:17:42.750491 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:17:43.748700 kubelet[2910]: E0128 01:17:43.748628 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:17:44.111865 update_engine[1660]: I20260128 01:17:44.111182 1660 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 28 01:17:44.111865 update_engine[1660]: I20260128 01:17:44.111330 1660 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 28 01:17:44.111865 update_engine[1660]: I20260128 01:17:44.111750 1660 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 28 01:17:44.118167 update_engine[1660]: E20260128 01:17:44.117988 1660 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 28 01:17:44.118167 update_engine[1660]: I20260128 01:17:44.118131 1660 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 28 01:17:44.752180 kubelet[2910]: E0128 01:17:44.751361 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:17:44.754194 kubelet[2910]: E0128 01:17:44.753218 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:17:54.116821 update_engine[1660]: I20260128 01:17:54.114275 1660 libcurl_http_fetcher.cc:47] Starting/Resuming transfer 
Jan 28 01:17:54.123335 update_engine[1660]: I20260128 01:17:54.120085 1660 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 28 01:17:54.123335 update_engine[1660]: I20260128 01:17:54.121102 1660 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 28 01:17:54.127344 update_engine[1660]: E20260128 01:17:54.126965 1660 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 28 01:17:54.127548 update_engine[1660]: I20260128 01:17:54.127518 1660 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 28 01:17:55.747883 kubelet[2910]: E0128 01:17:55.747845 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:17:56.750747 kubelet[2910]: E0128 01:17:56.750304 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:17:56.750747 kubelet[2910]: E0128 01:17:56.750576 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:17:56.753030 kubelet[2910]: E0128 01:17:56.752116 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:17:58.749027 kubelet[2910]: E0128 01:17:58.748969 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:17:58.750482 kubelet[2910]: E0128 01:17:58.750423 2910 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:18:04.115028 update_engine[1660]: I20260128 01:18:04.114326 1660 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 28 01:18:04.115501 update_engine[1660]: I20260128 01:18:04.115077 1660 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 28 01:18:04.115501 update_engine[1660]: I20260128 01:18:04.115422 1660 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 28 01:18:04.123106 update_engine[1660]: E20260128 01:18:04.123046 1660 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 28 01:18:04.123270 update_engine[1660]: I20260128 01:18:04.123138 1660 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 28 01:18:04.123270 update_engine[1660]: I20260128 01:18:04.123152 1660 omaha_request_action.cc:617] Omaha request response: Jan 28 01:18:04.123270 update_engine[1660]: E20260128 01:18:04.123252 1660 omaha_request_action.cc:636] Omaha request network transfer failed. 
Jan 28 01:18:04.124900 update_engine[1660]: I20260128 01:18:04.124863 1660 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jan 28 01:18:04.124900 update_engine[1660]: I20260128 01:18:04.124887 1660 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 28 01:18:04.124900 update_engine[1660]: I20260128 01:18:04.124894 1660 update_attempter.cc:306] Processing Done. Jan 28 01:18:04.125080 update_engine[1660]: E20260128 01:18:04.124910 1660 update_attempter.cc:619] Update failed. Jan 28 01:18:04.125080 update_engine[1660]: I20260128 01:18:04.124917 1660 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 28 01:18:04.125080 update_engine[1660]: I20260128 01:18:04.124922 1660 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 28 01:18:04.125080 update_engine[1660]: I20260128 01:18:04.124927 1660 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Jan 28 01:18:04.125080 update_engine[1660]: I20260128 01:18:04.125021 1660 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 28 01:18:04.125080 update_engine[1660]: I20260128 01:18:04.125048 1660 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 28 01:18:04.125080 update_engine[1660]: I20260128 01:18:04.125054 1660 omaha_request_action.cc:272] Request: Jan 28 01:18:04.125080 update_engine[1660]: Jan 28 01:18:04.125080 update_engine[1660]: Jan 28 01:18:04.125080 update_engine[1660]: Jan 28 01:18:04.125080 update_engine[1660]: Jan 28 01:18:04.125080 update_engine[1660]: Jan 28 01:18:04.125080 update_engine[1660]: Jan 28 01:18:04.125080 update_engine[1660]: I20260128 01:18:04.125061 1660 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 28 01:18:04.126083 update_engine[1660]: I20260128 01:18:04.125085 1660 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 28 01:18:04.126083 update_engine[1660]: I20260128 01:18:04.125367 1660 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 28 01:18:04.126131 locksmithd[1715]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 28 01:18:04.131901 update_engine[1660]: E20260128 01:18:04.131857 1660 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 28 01:18:04.132093 update_engine[1660]: I20260128 01:18:04.131932 1660 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 28 01:18:04.132093 update_engine[1660]: I20260128 01:18:04.131940 1660 omaha_request_action.cc:617] Omaha request response: Jan 28 01:18:04.132093 update_engine[1660]: I20260128 01:18:04.131947 1660 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 28 01:18:04.132093 update_engine[1660]: I20260128 01:18:04.131953 1660 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 28 01:18:04.132093 update_engine[1660]: I20260128 01:18:04.131958 1660 update_attempter.cc:306] Processing Done. Jan 28 01:18:04.132093 update_engine[1660]: I20260128 01:18:04.131965 1660 update_attempter.cc:310] Error event sent. 
Jan 28 01:18:04.132093 update_engine[1660]: I20260128 01:18:04.131974 1660 update_check_scheduler.cc:74] Next update check in 41m54s Jan 28 01:18:04.132705 locksmithd[1715]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 28 01:18:08.749565 kubelet[2910]: E0128 01:18:08.749489 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:18:09.758027 kubelet[2910]: E0128 01:18:09.757270 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:18:09.758027 kubelet[2910]: E0128 01:18:09.757423 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" 
podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:18:09.764632 kubelet[2910]: E0128 01:18:09.759882 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:18:09.764632 kubelet[2910]: E0128 01:18:09.760337 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:18:11.749018 kubelet[2910]: E0128 
01:18:11.748926 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:18:20.749084 kubelet[2910]: E0128 01:18:20.749028 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:18:21.747631 kubelet[2910]: E0128 01:18:21.747512 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for 
\"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:18:22.748589 kubelet[2910]: E0128 01:18:22.748392 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:18:22.750467 kubelet[2910]: E0128 01:18:22.750154 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:18:23.747467 kubelet[2910]: E0128 01:18:23.747390 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:18:24.758365 kubelet[2910]: E0128 01:18:24.758295 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:18:33.748365 kubelet[2910]: E0128 01:18:33.748311 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:18:34.751560 kubelet[2910]: E0128 01:18:34.751439 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:18:34.755110 kubelet[2910]: E0128 01:18:34.755066 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:18:37.748031 containerd[1674]: time="2026-01-28T01:18:37.747399540Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:18:38.114429 containerd[1674]: time="2026-01-28T01:18:38.114310941Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:18:38.116581 containerd[1674]: time="2026-01-28T01:18:38.116438829Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" Jan 28 01:18:38.116581 containerd[1674]: time="2026-01-28T01:18:38.116541173Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:18:38.117115 kubelet[2910]: E0128 01:18:38.116850 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:18:38.117115 kubelet[2910]: E0128 01:18:38.116894 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:18:38.117115 kubelet[2910]: E0128 01:18:38.117061 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgdzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-9mjk9_calico-apiserver(7ea90b44-fc7d-4702-a1a5-1c558b3ecd80): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:18:38.118812 kubelet[2910]: E0128 01:18:38.118782 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:18:38.750505 kubelet[2910]: E0128 01:18:38.749264 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:18:39.748072 kubelet[2910]: E0128 01:18:39.748018 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:18:45.748296 kubelet[2910]: E0128 01:18:45.748245 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:18:48.750127 containerd[1674]: time="2026-01-28T01:18:48.750053426Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:18:49.085555 containerd[1674]: time="2026-01-28T01:18:49.085510582Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:18:49.087356 containerd[1674]: time="2026-01-28T01:18:49.087320732Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:18:49.087476 containerd[1674]: time="2026-01-28T01:18:49.087340760Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 01:18:49.087717 kubelet[2910]: E0128 01:18:49.087559 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:18:49.087717 kubelet[2910]: E0128 01:18:49.087613 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:18:49.088296 kubelet[2910]: E0128 01:18:49.088111 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nnlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79f9cd9ddf-ggx6z_calico-system(72726466-f235-4a31-a84a-a3699d8c85f7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:18:49.088401 containerd[1674]: time="2026-01-28T01:18:49.087921142Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:18:49.089296 kubelet[2910]: E0128 01:18:49.089259 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not 
found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:18:49.427415 containerd[1674]: time="2026-01-28T01:18:49.427243203Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:18:49.429061 containerd[1674]: time="2026-01-28T01:18:49.429025796Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:18:49.429217 containerd[1674]: time="2026-01-28T01:18:49.429103868Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:18:49.429402 kubelet[2910]: E0128 01:18:49.429363 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:18:49.429510 kubelet[2910]: E0128 01:18:49.429491 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:18:49.429702 kubelet[2910]: E0128 01:18:49.429677 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8d1c1aed77714da78e5314e3a5e614ef,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:18:49.431848 containerd[1674]: time="2026-01-28T01:18:49.431778656Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:18:49.749155 kubelet[2910]: E0128 01:18:49.747934 2910 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:18:49.777846 containerd[1674]: time="2026-01-28T01:18:49.777795084Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:18:49.779641 containerd[1674]: time="2026-01-28T01:18:49.779584754Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:18:49.779787 containerd[1674]: time="2026-01-28T01:18:49.779613432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:18:49.779870 kubelet[2910]: E0128 01:18:49.779835 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:18:49.779938 kubelet[2910]: E0128 01:18:49.779888 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:18:49.780123 
kubelet[2910]: E0128 01:18:49.780042 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-b767f6fff-w2bfk_calico-system(3934179d-fb13-45a1-9643-cbd7ec08e773): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:18:49.781863 kubelet[2910]: E0128 01:18:49.781821 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:18:50.749972 kubelet[2910]: E0128 01:18:50.749030 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:18:53.749076 containerd[1674]: time="2026-01-28T01:18:53.747590381Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:18:54.098020 containerd[1674]: time="2026-01-28T01:18:54.097967135Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:18:54.099686 containerd[1674]: 
time="2026-01-28T01:18:54.099655182Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:18:54.099820 containerd[1674]: time="2026-01-28T01:18:54.099721152Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:18:54.099885 kubelet[2910]: E0128 01:18:54.099847 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:18:54.100175 kubelet[2910]: E0128 01:18:54.099891 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:18:54.100175 kubelet[2910]: E0128 01:18:54.100045 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fx89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76f584f9b9-6dh2c_calico-apiserver(bead6395-8434-48df-aa67-e987782da70c): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:18:54.101479 kubelet[2910]: E0128 01:18:54.101447 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:18:59.749459 containerd[1674]: time="2026-01-28T01:18:59.749179553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:19:00.091084 containerd[1674]: time="2026-01-28T01:19:00.090993055Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:19:00.092868 containerd[1674]: time="2026-01-28T01:19:00.092806867Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:19:00.092868 containerd[1674]: time="2026-01-28T01:19:00.092844941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:19:00.093325 kubelet[2910]: E0128 01:19:00.093215 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:19:00.093325 kubelet[2910]: E0128 01:19:00.093265 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:19:00.094000 kubelet[2910]: E0128 01:19:00.093915 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 01:19:00.096087 containerd[1674]: time="2026-01-28T01:19:00.095919198Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:19:00.630976 containerd[1674]: time="2026-01-28T01:19:00.630826328Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:19:00.632504 containerd[1674]: time="2026-01-28T01:19:00.632403274Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:19:00.632588 containerd[1674]: time="2026-01-28T01:19:00.632496713Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:19:00.632669 kubelet[2910]: E0128 01:19:00.632635 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:19:00.632710 kubelet[2910]: E0128 01:19:00.632679 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:19:00.632831 kubelet[2910]: E0128 01:19:00.632795 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lkr4f_calico-system(e60412d5-27c3-4569-9b64-5743c10cc437): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:19:00.634132 kubelet[2910]: E0128 01:19:00.634103 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:19:03.749028 kubelet[2910]: E0128 01:19:03.748275 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:19:03.749451 containerd[1674]: time="2026-01-28T01:19:03.748399627Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 
01:19:03.749684 kubelet[2910]: E0128 01:19:03.749067 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:19:04.080053 containerd[1674]: time="2026-01-28T01:19:04.079717433Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:19:04.081403 containerd[1674]: time="2026-01-28T01:19:04.081356994Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:19:04.081480 containerd[1674]: time="2026-01-28T01:19:04.081428016Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:19:04.081840 kubelet[2910]: E0128 01:19:04.081584 2910 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 
01:19:04.081840 kubelet[2910]: E0128 01:19:04.081629 2910 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:19:04.081840 kubelet[2910]: E0128 01:19:04.081761 2910 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqql6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cblpt_calico-system(c22de3ae-0a27-443f-9dd3-c4ab0a4176bd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:19:04.083201 kubelet[2910]: E0128 01:19:04.083143 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:19:05.748460 kubelet[2910]: E0128 01:19:05.748419 2910 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:19:07.749582 kubelet[2910]: E0128 01:19:07.749208 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:19:11.748467 kubelet[2910]: E0128 01:19:11.748420 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" 
podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:19:12.557136 kernel: kauditd_printk_skb: 52 callbacks suppressed Jan 28 01:19:12.557245 kernel: audit: type=1130 audit(1769563152.552:745): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.0.143:22-20.161.92.111:42644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:12.552000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.0.143:22-20.161.92.111:42644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:12.553274 systemd[1]: Started sshd@7-10.0.0.143:22-20.161.92.111:42644.service - OpenSSH per-connection server daemon (20.161.92.111:42644). Jan 28 01:19:13.114000 audit[5181]: USER_ACCT pid=5181 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.119474 sshd[5181]: Accepted publickey for core from 20.161.92.111 port 42644 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:13.119714 sshd-session[5181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:13.120037 kernel: audit: type=1101 audit(1769563153.114:746): pid=5181 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.123149 kernel: audit: type=1103 audit(1769563153.116:747): pid=5181 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.116000 audit[5181]: CRED_ACQ pid=5181 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.116000 audit[5181]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdbe955f30 a2=3 a3=0 items=0 ppid=1 pid=5181 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:13.126625 kernel: audit: type=1006 audit(1769563153.116:748): pid=5181 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 28 01:19:13.126719 kernel: audit: type=1300 audit(1769563153.116:748): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdbe955f30 a2=3 a3=0 items=0 ppid=1 pid=5181 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:13.116000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:13.131041 kernel: audit: type=1327 audit(1769563153.116:748): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:13.135425 systemd-logind[1658]: New session 9 of user core. Jan 28 01:19:13.143208 systemd[1]: Started session-9.scope - Session 9 of User core. 
Jan 28 01:19:13.151000 audit[5181]: USER_START pid=5181 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.157043 kernel: audit: type=1105 audit(1769563153.151:749): pid=5181 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.155000 audit[5187]: CRED_ACQ pid=5187 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.161043 kernel: audit: type=1103 audit(1769563153.155:750): pid=5187 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.496404 sshd[5187]: Connection closed by 20.161.92.111 port 42644 Jan 28 01:19:13.496901 sshd-session[5181]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:13.499000 audit[5181]: USER_END pid=5181 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.506026 kernel: audit: type=1106 audit(1769563153.499:751): 
pid=5181 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.508096 systemd[1]: sshd@7-10.0.0.143:22-20.161.92.111:42644.service: Deactivated successfully. Jan 28 01:19:13.505000 audit[5181]: CRED_DISP pid=5181 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.510690 systemd[1]: session-9.scope: Deactivated successfully. Jan 28 01:19:13.511819 systemd-logind[1658]: Session 9 logged out. Waiting for processes to exit. Jan 28 01:19:13.513029 kernel: audit: type=1104 audit(1769563153.505:752): pid=5181 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:13.507000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.0.143:22-20.161.92.111:42644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:13.516297 systemd-logind[1658]: Removed session 9. 
Jan 28 01:19:15.748476 kubelet[2910]: E0128 01:19:15.748421 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:19:16.750804 kubelet[2910]: E0128 01:19:16.750765 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:19:17.748405 kubelet[2910]: E0128 01:19:17.748144 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:19:17.749033 kubelet[2910]: E0128 01:19:17.748988 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:19:18.603475 systemd[1]: Started sshd@8-10.0.0.143:22-20.161.92.111:42658.service - OpenSSH per-connection server daemon (20.161.92.111:42658). Jan 28 01:19:18.602000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.0.143:22-20.161.92.111:42658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:18.605157 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:19:18.605224 kernel: audit: type=1130 audit(1769563158.602:754): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.0.143:22-20.161.92.111:42658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:19.136000 audit[5200]: USER_ACCT pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.142499 sshd[5200]: Accepted publickey for core from 20.161.92.111 port 42658 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:19.144174 kernel: audit: type=1101 audit(1769563159.136:755): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.144666 sshd-session[5200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:19.149096 kernel: audit: type=1103 audit(1769563159.142:756): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.142000 audit[5200]: CRED_ACQ pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.160028 kernel: audit: type=1006 audit(1769563159.142:757): pid=5200 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 28 01:19:19.142000 audit[5200]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcde7197e0 a2=3 a3=0 items=0 ppid=1 pid=5200 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:19.166409 systemd-logind[1658]: New session 10 of user core. Jan 28 01:19:19.168498 kernel: audit: type=1300 audit(1769563159.142:757): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcde7197e0 a2=3 a3=0 items=0 ppid=1 pid=5200 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:19.169305 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 28 01:19:19.142000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:19.175029 kernel: audit: type=1327 audit(1769563159.142:757): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:19.180000 audit[5200]: USER_START pid=5200 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.188035 kernel: audit: type=1105 audit(1769563159.180:758): pid=5200 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.189000 audit[5213]: CRED_ACQ pid=5213 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.197022 kernel: audit: type=1103 audit(1769563159.189:759): 
pid=5213 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.524778 sshd[5213]: Connection closed by 20.161.92.111 port 42658 Jan 28 01:19:19.525652 sshd-session[5200]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:19.528000 audit[5200]: USER_END pid=5200 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.534929 systemd[1]: sshd@8-10.0.0.143:22-20.161.92.111:42658.service: Deactivated successfully. Jan 28 01:19:19.535188 kernel: audit: type=1106 audit(1769563159.528:760): pid=5200 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.537972 systemd[1]: session-10.scope: Deactivated successfully. Jan 28 01:19:19.538829 systemd-logind[1658]: Session 10 logged out. Waiting for processes to exit. Jan 28 01:19:19.528000 audit[5200]: CRED_DISP pid=5200 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.543846 systemd-logind[1658]: Removed session 10. 
Jan 28 01:19:19.545094 kernel: audit: type=1104 audit(1769563159.528:761): pid=5200 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:19.534000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.0.143:22-20.161.92.111:42658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:19.749062 kubelet[2910]: E0128 01:19:19.748966 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:19:24.633204 systemd[1]: Started sshd@9-10.0.0.143:22-20.161.92.111:46588.service - OpenSSH per-connection server daemon (20.161.92.111:46588). Jan 28 01:19:24.632000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.0.143:22-20.161.92.111:46588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:24.634072 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:19:24.635224 kernel: audit: type=1130 audit(1769563164.632:763): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.0.143:22-20.161.92.111:46588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:24.751242 kubelet[2910]: E0128 01:19:24.751192 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:19:25.154000 audit[5249]: USER_ACCT pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.161182 kernel: audit: type=1101 audit(1769563165.154:764): pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.161264 sshd[5249]: Accepted publickey for core from 20.161.92.111 port 46588 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:25.161000 audit[5249]: CRED_ACQ pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.167485 kernel: audit: type=1103 audit(1769563165.161:765): pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.163870 sshd-session[5249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:25.172375 kernel: audit: type=1006 audit(1769563165.162:766): pid=5249 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 28 01:19:25.162000 audit[5249]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3ce6d920 a2=3 a3=0 items=0 ppid=1 pid=5249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:25.178798 kernel: audit: type=1300 audit(1769563165.162:766): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3ce6d920 a2=3 a3=0 items=0 ppid=1 pid=5249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:25.162000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:25.183198 kernel: audit: type=1327 audit(1769563165.162:766): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:25.187361 systemd-logind[1658]: New session 11 of user core. Jan 28 01:19:25.192357 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 28 01:19:25.205122 kernel: audit: type=1105 audit(1769563165.196:767): pid=5249 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.196000 audit[5249]: USER_START pid=5249 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.204000 audit[5255]: CRED_ACQ pid=5255 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.212127 kernel: audit: type=1103 audit(1769563165.204:768): pid=5255 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.524928 sshd[5255]: Connection closed by 20.161.92.111 port 46588 Jan 28 01:19:25.524217 sshd-session[5249]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:25.525000 audit[5249]: USER_END pid=5249 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.530867 systemd[1]: 
sshd@9-10.0.0.143:22-20.161.92.111:46588.service: Deactivated successfully. Jan 28 01:19:25.532240 kernel: audit: type=1106 audit(1769563165.525:769): pid=5249 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.525000 audit[5249]: CRED_DISP pid=5249 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.534869 systemd[1]: session-11.scope: Deactivated successfully. Jan 28 01:19:25.530000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.0.143:22-20.161.92.111:46588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:25.537086 kernel: audit: type=1104 audit(1769563165.525:770): pid=5249 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:25.537159 systemd-logind[1658]: Session 11 logged out. Waiting for processes to exit. Jan 28 01:19:25.539580 systemd-logind[1658]: Removed session 11. Jan 28 01:19:25.627000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.0.143:22-20.161.92.111:46598 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:25.628253 systemd[1]: Started sshd@10-10.0.0.143:22-20.161.92.111:46598.service - OpenSSH per-connection server daemon (20.161.92.111:46598). 
Jan 28 01:19:26.146000 audit[5268]: USER_ACCT pid=5268 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:26.148145 sshd[5268]: Accepted publickey for core from 20.161.92.111 port 46598 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:26.147000 audit[5268]: CRED_ACQ pid=5268 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:26.147000 audit[5268]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff56ec5750 a2=3 a3=0 items=0 ppid=1 pid=5268 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:26.147000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:26.149677 sshd-session[5268]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:26.155399 systemd-logind[1658]: New session 12 of user core. Jan 28 01:19:26.160184 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 28 01:19:26.162000 audit[5268]: USER_START pid=5268 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:26.164000 audit[5276]: CRED_ACQ pid=5276 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:26.530888 sshd[5276]: Connection closed by 20.161.92.111 port 46598 Jan 28 01:19:26.531366 sshd-session[5268]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:26.532000 audit[5268]: USER_END pid=5268 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:26.532000 audit[5268]: CRED_DISP pid=5268 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:26.536369 systemd-logind[1658]: Session 12 logged out. Waiting for processes to exit. Jan 28 01:19:26.536923 systemd[1]: sshd@10-10.0.0.143:22-20.161.92.111:46598.service: Deactivated successfully. Jan 28 01:19:26.536000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.0.143:22-20.161.92.111:46598 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:26.538810 systemd[1]: session-12.scope: Deactivated successfully. Jan 28 01:19:26.540316 systemd-logind[1658]: Removed session 12. Jan 28 01:19:26.635000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.0.143:22-20.161.92.111:46614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:26.636491 systemd[1]: Started sshd@11-10.0.0.143:22-20.161.92.111:46614.service - OpenSSH per-connection server daemon (20.161.92.111:46614). Jan 28 01:19:27.159000 audit[5286]: USER_ACCT pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:27.160824 sshd[5286]: Accepted publickey for core from 20.161.92.111 port 46614 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:27.160000 audit[5286]: CRED_ACQ pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:27.160000 audit[5286]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd1d2d3d0 a2=3 a3=0 items=0 ppid=1 pid=5286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:27.160000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:27.164146 sshd-session[5286]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:27.169951 systemd-logind[1658]: New session 13 of user core. 
Jan 28 01:19:27.174171 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 28 01:19:27.177000 audit[5286]: USER_START pid=5286 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:27.179000 audit[5290]: CRED_ACQ pid=5290 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:27.527224 sshd[5290]: Connection closed by 20.161.92.111 port 46614 Jan 28 01:19:27.528915 sshd-session[5286]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:27.529000 audit[5286]: USER_END pid=5286 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:27.530000 audit[5286]: CRED_DISP pid=5286 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:27.534923 systemd[1]: sshd@11-10.0.0.143:22-20.161.92.111:46614.service: Deactivated successfully. Jan 28 01:19:27.534000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.0.143:22-20.161.92.111:46614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:27.540117 systemd[1]: session-13.scope: Deactivated successfully. Jan 28 01:19:27.542411 systemd-logind[1658]: Session 13 logged out. Waiting for processes to exit. Jan 28 01:19:27.543728 systemd-logind[1658]: Removed session 13. Jan 28 01:19:27.747550 kubelet[2910]: E0128 01:19:27.747513 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:19:28.750277 kubelet[2910]: E0128 01:19:28.749221 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:19:28.751426 kubelet[2910]: E0128 01:19:28.750945 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:19:29.749194 kubelet[2910]: E0128 01:19:29.749055 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:19:31.748297 kubelet[2910]: E0128 01:19:31.748233 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:19:32.633664 systemd[1]: Started sshd@12-10.0.0.143:22-20.161.92.111:41688.service - OpenSSH per-connection server daemon (20.161.92.111:41688). Jan 28 01:19:32.632000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.0.143:22-20.161.92.111:41688 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:32.636021 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 28 01:19:32.636109 kernel: audit: type=1130 audit(1769563172.632:790): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.0.143:22-20.161.92.111:41688 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:33.147000 audit[5301]: USER_ACCT pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.150178 sshd[5301]: Accepted publickey for core from 20.161.92.111 port 41688 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:33.154262 kernel: audit: type=1101 audit(1769563173.147:791): pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.154100 sshd-session[5301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:33.152000 audit[5301]: CRED_ACQ pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.160326 kernel: audit: type=1103 audit(1769563173.152:792): pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.160393 kernel: audit: type=1006 
audit(1769563173.152:793): pid=5301 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 28 01:19:33.152000 audit[5301]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc323bc930 a2=3 a3=0 items=0 ppid=1 pid=5301 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:33.163846 kernel: audit: type=1300 audit(1769563173.152:793): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc323bc930 a2=3 a3=0 items=0 ppid=1 pid=5301 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:33.152000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:33.167400 kernel: audit: type=1327 audit(1769563173.152:793): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:33.171390 systemd-logind[1658]: New session 14 of user core. Jan 28 01:19:33.177283 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 28 01:19:33.181000 audit[5301]: USER_START pid=5301 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.188030 kernel: audit: type=1105 audit(1769563173.181:794): pid=5301 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.187000 audit[5305]: CRED_ACQ pid=5305 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.193038 kernel: audit: type=1103 audit(1769563173.187:795): pid=5305 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.503444 sshd[5305]: Connection closed by 20.161.92.111 port 41688 Jan 28 01:19:33.504113 sshd-session[5301]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:33.511000 audit[5301]: USER_END pid=5301 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.517456 systemd[1]: 
sshd@12-10.0.0.143:22-20.161.92.111:41688.service: Deactivated successfully. Jan 28 01:19:33.518023 kernel: audit: type=1106 audit(1769563173.511:796): pid=5301 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.511000 audit[5301]: CRED_DISP pid=5301 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.521711 systemd[1]: session-14.scope: Deactivated successfully. Jan 28 01:19:33.523385 systemd-logind[1658]: Session 14 logged out. Waiting for processes to exit. Jan 28 01:19:33.524031 kernel: audit: type=1104 audit(1769563173.511:797): pid=5301 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:33.517000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.0.143:22-20.161.92.111:41688 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:33.525713 systemd-logind[1658]: Removed session 14. Jan 28 01:19:33.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.0.143:22-20.161.92.111:41704 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:33.610265 systemd[1]: Started sshd@13-10.0.0.143:22-20.161.92.111:41704.service - OpenSSH per-connection server daemon (20.161.92.111:41704). 
Jan 28 01:19:34.139000 audit[5318]: USER_ACCT pid=5318 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:34.141475 sshd[5318]: Accepted publickey for core from 20.161.92.111 port 41704 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:34.141000 audit[5318]: CRED_ACQ pid=5318 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:34.141000 audit[5318]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd4b1bd60 a2=3 a3=0 items=0 ppid=1 pid=5318 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:34.141000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:34.143179 sshd-session[5318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:34.147481 systemd-logind[1658]: New session 15 of user core. Jan 28 01:19:34.154259 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 28 01:19:34.155000 audit[5318]: USER_START pid=5318 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:34.157000 audit[5322]: CRED_ACQ pid=5322 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:34.823984 sshd[5322]: Connection closed by 20.161.92.111 port 41704 Jan 28 01:19:34.824981 sshd-session[5318]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:34.825000 audit[5318]: USER_END pid=5318 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:34.825000 audit[5318]: CRED_DISP pid=5318 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:34.830402 systemd[1]: sshd@13-10.0.0.143:22-20.161.92.111:41704.service: Deactivated successfully. Jan 28 01:19:34.829000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.0.143:22-20.161.92.111:41704 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:34.833389 systemd[1]: session-15.scope: Deactivated successfully. 
Jan 28 01:19:34.836380 systemd-logind[1658]: Session 15 logged out. Waiting for processes to exit. Jan 28 01:19:34.837500 systemd-logind[1658]: Removed session 15. Jan 28 01:19:34.931330 systemd[1]: Started sshd@14-10.0.0.143:22-20.161.92.111:41710.service - OpenSSH per-connection server daemon (20.161.92.111:41710). Jan 28 01:19:34.930000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.0.143:22-20.161.92.111:41710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:35.455000 audit[5333]: USER_ACCT pid=5333 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:35.456993 sshd[5333]: Accepted publickey for core from 20.161.92.111 port 41710 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:35.456000 audit[5333]: CRED_ACQ pid=5333 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:35.456000 audit[5333]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff82c782f0 a2=3 a3=0 items=0 ppid=1 pid=5333 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:35.456000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:35.458601 sshd-session[5333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:35.463134 systemd-logind[1658]: New session 16 of user core. 
Jan 28 01:19:35.473398 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 28 01:19:35.475000 audit[5333]: USER_START pid=5333 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:35.478000 audit[5338]: CRED_ACQ pid=5338 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:36.346000 audit[5348]: NETFILTER_CFG table=filter:140 family=2 entries=26 op=nft_register_rule pid=5348 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:19:36.346000 audit[5348]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff4f9fbb80 a2=0 a3=7fff4f9fbb6c items=0 ppid=3019 pid=5348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:36.346000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:19:36.352000 audit[5348]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=5348 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:19:36.352000 audit[5348]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff4f9fbb80 a2=0 a3=0 items=0 ppid=3019 pid=5348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:36.352000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:19:36.375000 audit[5350]: NETFILTER_CFG table=filter:142 family=2 entries=38 op=nft_register_rule pid=5350 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:19:36.375000 audit[5350]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffe6ffb92b0 a2=0 a3=7ffe6ffb929c items=0 ppid=3019 pid=5350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:36.375000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:19:36.382000 audit[5350]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5350 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:19:36.382000 audit[5350]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffe6ffb92b0 a2=0 a3=0 items=0 ppid=3019 pid=5350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:36.382000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:19:36.446819 sshd[5338]: Connection closed by 20.161.92.111 port 41710 Jan 28 01:19:36.447226 sshd-session[5333]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:36.449000 audit[5333]: USER_END pid=5333 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 
addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:36.449000 audit[5333]: CRED_DISP pid=5333 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:36.452470 systemd-logind[1658]: Session 16 logged out. Waiting for processes to exit. Jan 28 01:19:36.453052 systemd[1]: sshd@14-10.0.0.143:22-20.161.92.111:41710.service: Deactivated successfully. Jan 28 01:19:36.452000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.0.143:22-20.161.92.111:41710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:36.454945 systemd[1]: session-16.scope: Deactivated successfully. Jan 28 01:19:36.456603 systemd-logind[1658]: Removed session 16. Jan 28 01:19:36.554199 systemd[1]: Started sshd@15-10.0.0.143:22-20.161.92.111:41718.service - OpenSSH per-connection server daemon (20.161.92.111:41718). Jan 28 01:19:36.553000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.0.143:22-20.161.92.111:41718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:37.078000 audit[5355]: USER_ACCT pid=5355 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:37.080145 sshd[5355]: Accepted publickey for core from 20.161.92.111 port 41718 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:37.079000 audit[5355]: CRED_ACQ pid=5355 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:37.079000 audit[5355]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd1d872630 a2=3 a3=0 items=0 ppid=1 pid=5355 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:37.079000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:37.081669 sshd-session[5355]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:37.087839 systemd-logind[1658]: New session 17 of user core. Jan 28 01:19:37.092177 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 28 01:19:37.094000 audit[5355]: USER_START pid=5355 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:37.096000 audit[5359]: CRED_ACQ pid=5359 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:37.597219 sshd[5359]: Connection closed by 20.161.92.111 port 41718 Jan 28 01:19:37.598223 sshd-session[5355]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:37.599000 audit[5355]: USER_END pid=5355 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:37.599000 audit[5355]: CRED_DISP pid=5355 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:37.602422 systemd[1]: sshd@15-10.0.0.143:22-20.161.92.111:41718.service: Deactivated successfully. Jan 28 01:19:37.601000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.0.143:22-20.161.92.111:41718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:37.604542 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 28 01:19:37.605517 systemd-logind[1658]: Session 17 logged out. Waiting for processes to exit. Jan 28 01:19:37.607580 systemd-logind[1658]: Removed session 17. Jan 28 01:19:37.702458 systemd[1]: Started sshd@16-10.0.0.143:22-20.161.92.111:41734.service - OpenSSH per-connection server daemon (20.161.92.111:41734). Jan 28 01:19:37.707338 kernel: kauditd_printk_skb: 46 callbacks suppressed Jan 28 01:19:37.707422 kernel: audit: type=1130 audit(1769563177.701:830): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.0.143:22-20.161.92.111:41734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:37.701000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.0.143:22-20.161.92.111:41734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:38.219000 audit[5369]: USER_ACCT pid=5369 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.222898 sshd-session[5369]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:38.225034 kernel: audit: type=1101 audit(1769563178.219:831): pid=5369 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.225068 sshd[5369]: Accepted publickey for core from 20.161.92.111 port 41734 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:38.219000 audit[5369]: CRED_ACQ pid=5369 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.229041 kernel: audit: type=1103 audit(1769563178.219:832): pid=5369 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.232566 systemd-logind[1658]: New session 18 of user core. Jan 28 01:19:38.219000 audit[5369]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcde4ea120 a2=3 a3=0 items=0 ppid=1 pid=5369 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:38.235232 kernel: audit: type=1006 audit(1769563178.219:833): pid=5369 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 28 01:19:38.235278 kernel: audit: type=1300 audit(1769563178.219:833): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcde4ea120 a2=3 a3=0 items=0 ppid=1 pid=5369 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:38.219000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:38.238222 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 28 01:19:38.239024 kernel: audit: type=1327 audit(1769563178.219:833): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:38.241000 audit[5369]: USER_START pid=5369 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.241000 audit[5373]: CRED_ACQ pid=5373 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.248981 kernel: audit: type=1105 audit(1769563178.241:834): pid=5369 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.249098 kernel: audit: type=1103 audit(1769563178.241:835): pid=5373 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.581769 sshd[5373]: Connection closed by 20.161.92.111 port 41734 Jan 28 01:19:38.582520 sshd-session[5369]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:38.584000 audit[5369]: USER_END pid=5369 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.591232 kernel: audit: type=1106 audit(1769563178.584:836): pid=5369 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.589754 systemd[1]: sshd@16-10.0.0.143:22-20.161.92.111:41734.service: Deactivated successfully. Jan 28 01:19:38.584000 audit[5369]: CRED_DISP pid=5369 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.593386 systemd[1]: session-18.scope: Deactivated successfully. Jan 28 01:19:38.596957 kernel: audit: type=1104 audit(1769563178.584:837): pid=5369 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:38.598661 systemd-logind[1658]: Session 18 logged out. Waiting for processes to exit. Jan 28 01:19:38.602268 systemd-logind[1658]: Removed session 18. Jan 28 01:19:38.588000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.0.143:22-20.161.92.111:41734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:39.748182 kubelet[2910]: E0128 01:19:39.747763 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:19:39.749754 kubelet[2910]: E0128 01:19:39.748702 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:19:40.748760 kubelet[2910]: E0128 01:19:40.747929 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not 
found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:19:41.352000 audit[5386]: NETFILTER_CFG table=filter:144 family=2 entries=26 op=nft_register_rule pid=5386 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:19:41.352000 audit[5386]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd8b6e0da0 a2=0 a3=7ffd8b6e0d8c items=0 ppid=3019 pid=5386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:41.352000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:19:41.357000 audit[5386]: NETFILTER_CFG table=nat:145 family=2 entries=104 op=nft_register_chain pid=5386 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:19:41.357000 audit[5386]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffd8b6e0da0 a2=0 a3=7ffd8b6e0d8c items=0 ppid=3019 pid=5386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:41.357000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:19:41.747719 kubelet[2910]: E0128 01:19:41.747453 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:19:43.689211 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 28 01:19:43.689302 kernel: audit: type=1130 audit(1769563183.687:841): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.0.143:22-20.161.92.111:45308 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:43.687000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.0.143:22-20.161.92.111:45308 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:43.688269 systemd[1]: Started sshd@17-10.0.0.143:22-20.161.92.111:45308.service - OpenSSH per-connection server daemon (20.161.92.111:45308). 
Jan 28 01:19:43.747429 kubelet[2910]: E0128 01:19:43.747395 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:19:44.213000 audit[5388]: USER_ACCT pid=5388 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.215915 sshd[5388]: Accepted publickey for core from 20.161.92.111 port 45308 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:44.220046 kernel: audit: type=1101 audit(1769563184.213:842): pid=5388 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.220652 sshd-session[5388]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:44.218000 audit[5388]: CRED_ACQ pid=5388 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.226533 kernel: audit: type=1103 audit(1769563184.218:843): pid=5388 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.226618 kernel: audit: type=1006 audit(1769563184.218:844): pid=5388 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 28 01:19:44.230038 kernel: audit: type=1300 audit(1769563184.218:844): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff5ba7eae0 a2=3 a3=0 items=0 ppid=1 pid=5388 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:44.218000 audit[5388]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff5ba7eae0 a2=3 a3=0 items=0 ppid=1 pid=5388 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:44.218000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:44.236396 systemd-logind[1658]: New session 19 of user core. Jan 28 01:19:44.237285 kernel: audit: type=1327 audit(1769563184.218:844): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:44.243203 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 28 01:19:44.245000 audit[5388]: USER_START pid=5388 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.248000 audit[5392]: CRED_ACQ pid=5392 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.253076 kernel: audit: type=1105 audit(1769563184.245:845): pid=5388 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.253110 kernel: audit: type=1103 audit(1769563184.248:846): pid=5392 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.561710 sshd[5392]: Connection closed by 20.161.92.111 port 45308 Jan 28 01:19:44.561569 sshd-session[5388]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:44.562000 audit[5388]: USER_END pid=5388 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.566388 systemd-logind[1658]: Session 19 logged out. 
Waiting for processes to exit. Jan 28 01:19:44.568777 systemd[1]: sshd@17-10.0.0.143:22-20.161.92.111:45308.service: Deactivated successfully. Jan 28 01:19:44.571118 kernel: audit: type=1106 audit(1769563184.562:847): pid=5388 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.571781 systemd[1]: session-19.scope: Deactivated successfully. Jan 28 01:19:44.562000 audit[5388]: CRED_DISP pid=5388 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.576359 systemd-logind[1658]: Removed session 19. Jan 28 01:19:44.577096 kernel: audit: type=1104 audit(1769563184.562:848): pid=5388 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:44.566000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.0.143:22-20.161.92.111:45308 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:46.749610 kubelet[2910]: E0128 01:19:46.749299 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:19:49.667000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.0.143:22-20.161.92.111:45314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:49.668714 systemd[1]: Started sshd@18-10.0.0.143:22-20.161.92.111:45314.service - OpenSSH per-connection server daemon (20.161.92.111:45314). Jan 28 01:19:49.670526 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:19:49.670576 kernel: audit: type=1130 audit(1769563189.667:850): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.0.143:22-20.161.92.111:45314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:50.208000 audit[5406]: USER_ACCT pid=5406 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.211356 sshd[5406]: Accepted publickey for core from 20.161.92.111 port 45314 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:50.212000 audit[5406]: CRED_ACQ pid=5406 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.215040 sshd-session[5406]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:50.215496 kernel: audit: type=1101 audit(1769563190.208:851): pid=5406 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.215531 kernel: audit: type=1103 audit(1769563190.212:852): pid=5406 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.219016 kernel: audit: type=1006 audit(1769563190.213:853): pid=5406 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 28 01:19:50.213000 audit[5406]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffda3bffab0 a2=3 a3=0 items=0 ppid=1 pid=5406 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:50.222337 kernel: audit: type=1300 audit(1769563190.213:853): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffda3bffab0 a2=3 a3=0 items=0 ppid=1 pid=5406 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:50.213000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:50.225505 kernel: audit: type=1327 audit(1769563190.213:853): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:50.229023 systemd-logind[1658]: New session 20 of user core. Jan 28 01:19:50.234175 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 28 01:19:50.237000 audit[5406]: USER_START pid=5406 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.243089 kernel: audit: type=1105 audit(1769563190.237:854): pid=5406 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.243000 audit[5410]: CRED_ACQ pid=5410 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.248104 kernel: audit: type=1103 audit(1769563190.243:855): 
pid=5410 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.333683 systemd[1846]: Created slice background.slice - User Background Tasks Slice. Jan 28 01:19:50.334888 systemd[1846]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 28 01:19:50.360667 systemd[1846]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. Jan 28 01:19:50.556381 sshd[5410]: Connection closed by 20.161.92.111 port 45314 Jan 28 01:19:50.555084 sshd-session[5406]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:50.555000 audit[5406]: USER_END pid=5406 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.559169 systemd[1]: sshd@18-10.0.0.143:22-20.161.92.111:45314.service: Deactivated successfully. Jan 28 01:19:50.562084 kernel: audit: type=1106 audit(1769563190.555:856): pid=5406 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.560733 systemd[1]: session-20.scope: Deactivated successfully. Jan 28 01:19:50.561177 systemd-logind[1658]: Session 20 logged out. Waiting for processes to exit. 
Jan 28 01:19:50.566206 kernel: audit: type=1104 audit(1769563190.556:857): pid=5406 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.556000 audit[5406]: CRED_DISP pid=5406 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:50.564964 systemd-logind[1658]: Removed session 20. Jan 28 01:19:50.558000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.0.143:22-20.161.92.111:45314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:50.748846 kubelet[2910]: E0128 01:19:50.748606 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:19:52.749960 kubelet[2910]: E0128 01:19:52.749586 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:19:52.750772 kubelet[2910]: E0128 01:19:52.750020 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:19:55.664916 systemd[1]: Started sshd@19-10.0.0.143:22-20.161.92.111:52588.service - OpenSSH per-connection server daemon (20.161.92.111:52588). Jan 28 01:19:55.664000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.0.143:22-20.161.92.111:52588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:55.666496 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:19:55.666548 kernel: audit: type=1130 audit(1769563195.664:859): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.0.143:22-20.161.92.111:52588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:55.748824 kubelet[2910]: E0128 01:19:55.748738 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:19:56.192000 audit[5447]: USER_ACCT pid=5447 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.199247 kernel: audit: type=1101 audit(1769563196.192:860): pid=5447 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.199316 sshd[5447]: Accepted publickey for core from 20.161.92.111 port 52588 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:19:56.200715 sshd-session[5447]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:56.205076 kernel: audit: type=1103 audit(1769563196.198:861): pid=5447 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.198000 audit[5447]: CRED_ACQ pid=5447 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.213168 kernel: audit: type=1006 audit(1769563196.198:862): pid=5447 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 28 01:19:56.214632 kernel: audit: type=1300 audit(1769563196.198:862): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffceb5fad50 a2=3 a3=0 items=0 ppid=1 pid=5447 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:56.198000 audit[5447]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffceb5fad50 a2=3 a3=0 items=0 ppid=1 pid=5447 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:56.198000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:56.218071 kernel: audit: type=1327 audit(1769563196.198:862): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:19:56.221641 systemd-logind[1658]: New session 21 of user core. Jan 28 01:19:56.228152 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 28 01:19:56.230000 audit[5447]: USER_START pid=5447 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.237128 kernel: audit: type=1105 audit(1769563196.230:863): pid=5447 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.236000 audit[5451]: CRED_ACQ pid=5451 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.241089 kernel: audit: type=1103 audit(1769563196.236:864): pid=5451 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.561481 sshd[5451]: Connection closed by 20.161.92.111 port 52588 Jan 28 01:19:56.561338 sshd-session[5447]: pam_unix(sshd:session): session closed for user core Jan 28 01:19:56.561000 audit[5447]: USER_END pid=5447 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.567373 systemd[1]: 
sshd@19-10.0.0.143:22-20.161.92.111:52588.service: Deactivated successfully. Jan 28 01:19:56.569046 kernel: audit: type=1106 audit(1769563196.561:865): pid=5447 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.569091 kernel: audit: type=1104 audit(1769563196.561:866): pid=5447 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.561000 audit[5447]: CRED_DISP pid=5447 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:19:56.567000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.0.143:22-20.161.92.111:52588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:56.573435 systemd[1]: session-21.scope: Deactivated successfully. Jan 28 01:19:56.575056 systemd-logind[1658]: Session 21 logged out. Waiting for processes to exit. Jan 28 01:19:56.576997 systemd-logind[1658]: Removed session 21. 
Jan 28 01:19:56.749393 kubelet[2910]: E0128 01:19:56.749351 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:19:58.747746 kubelet[2910]: E0128 01:19:58.747691 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:20:01.675128 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:20:01.675487 kernel: audit: type=1130 audit(1769563201.669:868): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.0.143:22-20.161.92.111:52590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:01.669000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.0.143:22-20.161.92.111:52590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:01.670293 systemd[1]: Started sshd@20-10.0.0.143:22-20.161.92.111:52590.service - OpenSSH per-connection server daemon (20.161.92.111:52590). 
Jan 28 01:20:01.748652 kubelet[2910]: E0128 01:20:01.748616 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:20:02.203166 sshd[5463]: Accepted publickey for core from 20.161.92.111 port 52590 ssh2: RSA SHA256:u7tV79mK9Za6w/yPdRl0UmbGy3sJOXYjH/g+qt6t2hM Jan 28 01:20:02.210850 kernel: audit: type=1101 audit(1769563202.202:869): pid=5463 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.202000 audit[5463]: USER_ACCT pid=5463 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.208817 sshd-session[5463]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:20:02.207000 audit[5463]: CRED_ACQ pid=5463 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.216024 kernel: audit: type=1103 audit(1769563202.207:870): pid=5463 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock 
acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.216122 kernel: audit: type=1006 audit(1769563202.207:871): pid=5463 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 28 01:20:02.207000 audit[5463]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd7cb6de30 a2=3 a3=0 items=0 ppid=1 pid=5463 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:02.207000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:20:02.225573 kernel: audit: type=1300 audit(1769563202.207:871): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd7cb6de30 a2=3 a3=0 items=0 ppid=1 pid=5463 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:02.225635 kernel: audit: type=1327 audit(1769563202.207:871): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:20:02.229094 systemd-logind[1658]: New session 22 of user core. Jan 28 01:20:02.231227 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 28 01:20:02.234000 audit[5463]: USER_START pid=5463 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.241040 kernel: audit: type=1105 audit(1769563202.234:872): pid=5463 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.240000 audit[5467]: CRED_ACQ pid=5467 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.245029 kernel: audit: type=1103 audit(1769563202.240:873): pid=5467 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.586029 sshd[5467]: Connection closed by 20.161.92.111 port 52590 Jan 28 01:20:02.586109 sshd-session[5463]: pam_unix(sshd:session): session closed for user core Jan 28 01:20:02.588000 audit[5463]: USER_END pid=5463 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.593770 systemd[1]: 
sshd@20-10.0.0.143:22-20.161.92.111:52590.service: Deactivated successfully. Jan 28 01:20:02.596103 kernel: audit: type=1106 audit(1769563202.588:874): pid=5463 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.594105 systemd-logind[1658]: Session 22 logged out. Waiting for processes to exit. Jan 28 01:20:02.598257 systemd[1]: session-22.scope: Deactivated successfully. Jan 28 01:20:02.588000 audit[5463]: CRED_DISP pid=5463 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.601362 systemd-logind[1658]: Removed session 22. Jan 28 01:20:02.603030 kernel: audit: type=1104 audit(1769563202.588:875): pid=5463 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 28 01:20:02.592000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.0.143:22-20.161.92.111:52590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:20:03.747444 kubelet[2910]: E0128 01:20:03.747105 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:20:05.747844 kubelet[2910]: E0128 01:20:05.747796 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:20:06.748669 kubelet[2910]: E0128 01:20:06.748478 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:20:08.539932 containerd[1674]: time="2026-01-28T01:20:08.539847856Z" level=info msg="container event discarded" container=a4c671a3f6e5d162760c93f92377b4b4b277ce9a29f70b1d2f9a21da6f076bb1 type=CONTAINER_CREATED_EVENT Jan 28 01:20:08.551174 containerd[1674]: time="2026-01-28T01:20:08.551090799Z" level=info msg="container event discarded" container=a4c671a3f6e5d162760c93f92377b4b4b277ce9a29f70b1d2f9a21da6f076bb1 type=CONTAINER_STARTED_EVENT Jan 28 01:20:08.584263 containerd[1674]: time="2026-01-28T01:20:08.584189615Z" level=info msg="container event discarded" container=4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86 type=CONTAINER_CREATED_EVENT Jan 28 01:20:08.606419 containerd[1674]: time="2026-01-28T01:20:08.606347391Z" level=info msg="container event discarded" container=6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec type=CONTAINER_CREATED_EVENT Jan 28 01:20:08.606419 containerd[1674]: time="2026-01-28T01:20:08.606403281Z" level=info msg="container event discarded" container=6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec type=CONTAINER_STARTED_EVENT Jan 28 01:20:08.622656 containerd[1674]: time="2026-01-28T01:20:08.622583448Z" level=info msg="container event discarded" container=dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017 type=CONTAINER_CREATED_EVENT Jan 28 01:20:08.622656 containerd[1674]: time="2026-01-28T01:20:08.622632281Z" level=info msg="container event 
discarded" container=dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017 type=CONTAINER_STARTED_EVENT Jan 28 01:20:08.638242 containerd[1674]: time="2026-01-28T01:20:08.638170171Z" level=info msg="container event discarded" container=1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2 type=CONTAINER_CREATED_EVENT Jan 28 01:20:08.657486 containerd[1674]: time="2026-01-28T01:20:08.657400115Z" level=info msg="container event discarded" container=3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01 type=CONTAINER_CREATED_EVENT Jan 28 01:20:08.680664 containerd[1674]: time="2026-01-28T01:20:08.680612284Z" level=info msg="container event discarded" container=4a3f0473acec8d174e9c2270090b9b3ee89f8cbe506ed06db368d2d88c0b7d86 type=CONTAINER_STARTED_EVENT Jan 28 01:20:08.763103 containerd[1674]: time="2026-01-28T01:20:08.763044015Z" level=info msg="container event discarded" container=1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2 type=CONTAINER_STARTED_EVENT Jan 28 01:20:08.796433 containerd[1674]: time="2026-01-28T01:20:08.796002060Z" level=info msg="container event discarded" container=3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01 type=CONTAINER_STARTED_EVENT Jan 28 01:20:10.748369 kubelet[2910]: E0128 01:20:10.748087 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:20:12.748799 kubelet[2910]: E0128 01:20:12.748720 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:20:13.749149 kubelet[2910]: E0128 01:20:13.748821 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:20:14.747438 kubelet[2910]: E0128 01:20:14.747392 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:20:18.689310 containerd[1674]: time="2026-01-28T01:20:18.689204885Z" level=info msg="container event discarded" container=74d5596222ee60da4e6264ad5d4922b63fd8316997f9f69439be61f0f4b715b0 type=CONTAINER_CREATED_EVENT Jan 28 01:20:18.689730 containerd[1674]: time="2026-01-28T01:20:18.689335925Z" level=info msg="container event discarded" 
container=74d5596222ee60da4e6264ad5d4922b63fd8316997f9f69439be61f0f4b715b0 type=CONTAINER_STARTED_EVENT Jan 28 01:20:18.733964 containerd[1674]: time="2026-01-28T01:20:18.733059974Z" level=info msg="container event discarded" container=5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c type=CONTAINER_CREATED_EVENT Jan 28 01:20:18.749972 kubelet[2910]: E0128 01:20:18.749921 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:20:18.912693 containerd[1674]: time="2026-01-28T01:20:18.912614480Z" level=info msg="container event discarded" container=5af2ed17c1b13ce173285ea19144aea06ce70b4151dffe055e67f6e853db036c type=CONTAINER_STARTED_EVENT Jan 28 01:20:19.235933 containerd[1674]: time="2026-01-28T01:20:19.235864999Z" level=info msg="container event discarded" container=3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce type=CONTAINER_CREATED_EVENT Jan 28 01:20:19.235933 containerd[1674]: time="2026-01-28T01:20:19.235914900Z" level=info msg="container event discarded" container=3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce type=CONTAINER_STARTED_EVENT Jan 28 01:20:20.750307 
kubelet[2910]: E0128 01:20:20.750078 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:20:21.396047 containerd[1674]: time="2026-01-28T01:20:21.395952097Z" level=info msg="container event discarded" container=2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623 type=CONTAINER_CREATED_EVENT Jan 28 01:20:21.451750 containerd[1674]: time="2026-01-28T01:20:21.451668213Z" level=info msg="container event discarded" container=2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623 type=CONTAINER_STARTED_EVENT Jan 28 01:20:25.749035 kubelet[2910]: E0128 01:20:25.748838 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:20:25.749966 kubelet[2910]: E0128 
01:20:25.749915 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80" Jan 28 01:20:26.747624 kubelet[2910]: E0128 01:20:26.747553 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:20:28.748079 kubelet[2910]: E0128 01:20:28.747908 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:20:29.856325 systemd[1]: cri-containerd-1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2.scope: Deactivated successfully. 
Jan 28 01:20:29.860666 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:20:29.860887 kernel: audit: type=1334 audit(1769563229.857:877): prog-id=256 op=LOAD Jan 28 01:20:29.857000 audit: BPF prog-id=256 op=LOAD Jan 28 01:20:29.856980 systemd[1]: cri-containerd-1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2.scope: Consumed 3.558s CPU time, 60.2M memory peak, 192K read from disk. Jan 28 01:20:29.861097 containerd[1674]: time="2026-01-28T01:20:29.858467847Z" level=info msg="received container exit event container_id:\"1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2\" id:\"1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2\" pid:2755 exit_status:1 exited_at:{seconds:1769563229 nanos:858135187}" Jan 28 01:20:29.857000 audit: BPF prog-id=88 op=UNLOAD Jan 28 01:20:29.864020 kernel: audit: type=1334 audit(1769563229.857:878): prog-id=88 op=UNLOAD Jan 28 01:20:29.863000 audit: BPF prog-id=103 op=UNLOAD Jan 28 01:20:29.863000 audit: BPF prog-id=107 op=UNLOAD Jan 28 01:20:29.867477 kernel: audit: type=1334 audit(1769563229.863:879): prog-id=103 op=UNLOAD Jan 28 01:20:29.867528 kernel: audit: type=1334 audit(1769563229.863:880): prog-id=107 op=UNLOAD Jan 28 01:20:29.885329 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2-rootfs.mount: Deactivated successfully. Jan 28 01:20:30.264940 kubelet[2910]: E0128 01:20:30.264738 2910 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.143:50308->10.0.0.202:2379: read: connection timed out" Jan 28 01:20:30.347402 systemd[1]: cri-containerd-2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623.scope: Deactivated successfully. Jan 28 01:20:30.347760 systemd[1]: cri-containerd-2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623.scope: Consumed 39.472s CPU time, 116.8M memory peak. 
Jan 28 01:20:30.348000 audit: BPF prog-id=146 op=UNLOAD Jan 28 01:20:30.352134 kernel: audit: type=1334 audit(1769563230.348:881): prog-id=146 op=UNLOAD Jan 28 01:20:30.352205 kernel: audit: type=1334 audit(1769563230.348:882): prog-id=150 op=UNLOAD Jan 28 01:20:30.348000 audit: BPF prog-id=150 op=UNLOAD Jan 28 01:20:30.354284 containerd[1674]: time="2026-01-28T01:20:30.354255662Z" level=info msg="received container exit event container_id:\"2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623\" id:\"2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623\" pid:3235 exit_status:1 exited_at:{seconds:1769563230 nanos:353850698}" Jan 28 01:20:30.377594 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623-rootfs.mount: Deactivated successfully. Jan 28 01:20:30.631061 kubelet[2910]: I0128 01:20:30.630727 2910 scope.go:117] "RemoveContainer" containerID="1a932d77c96795e154c74d3263506db2b3265230dee8cdeb3e8525c61bb9d3f2" Jan 28 01:20:30.633521 kubelet[2910]: I0128 01:20:30.633253 2910 scope.go:117] "RemoveContainer" containerID="2374a2656d3ad6cdf412683953a7f092a77a151c586ed24a959f4098645ab623" Jan 28 01:20:30.634625 containerd[1674]: time="2026-01-28T01:20:30.634583035Z" level=info msg="CreateContainer within sandbox \"6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 28 01:20:30.636247 containerd[1674]: time="2026-01-28T01:20:30.635855139Z" level=info msg="CreateContainer within sandbox \"3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 28 01:20:30.648878 containerd[1674]: time="2026-01-28T01:20:30.648836764Z" level=info msg="Container d476089002164f426c37dab83f0277f7cd8175a460b83185cb64336a2e0f8f99: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:20:30.654217 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount1614548310.mount: Deactivated successfully. Jan 28 01:20:30.655037 containerd[1674]: time="2026-01-28T01:20:30.654374937Z" level=info msg="Container f8cdc9e1f3d76845ffbcf47c8da3a07f5abbd90fbeb354837ba7ca481095d094: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:20:30.661522 containerd[1674]: time="2026-01-28T01:20:30.661497373Z" level=info msg="CreateContainer within sandbox \"6f1059959f18e2bd6d121d76702a30dbac1942f0edec9d402362c5c6b49b4aec\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"d476089002164f426c37dab83f0277f7cd8175a460b83185cb64336a2e0f8f99\"" Jan 28 01:20:30.662159 containerd[1674]: time="2026-01-28T01:20:30.662144282Z" level=info msg="StartContainer for \"d476089002164f426c37dab83f0277f7cd8175a460b83185cb64336a2e0f8f99\"" Jan 28 01:20:30.663183 containerd[1674]: time="2026-01-28T01:20:30.663150593Z" level=info msg="connecting to shim d476089002164f426c37dab83f0277f7cd8175a460b83185cb64336a2e0f8f99" address="unix:///run/containerd/s/d5627cad212e84b5a23d4c3a68cb2084981c2fdad63af2ba006e7f2d29682b66" protocol=ttrpc version=3 Jan 28 01:20:30.667068 containerd[1674]: time="2026-01-28T01:20:30.666994485Z" level=info msg="CreateContainer within sandbox \"3ad4aab15ad1bec082c373d2e01cdc315cb022af293343f5b7ab0212d77df4ce\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"f8cdc9e1f3d76845ffbcf47c8da3a07f5abbd90fbeb354837ba7ca481095d094\"" Jan 28 01:20:30.667449 containerd[1674]: time="2026-01-28T01:20:30.667427773Z" level=info msg="StartContainer for \"f8cdc9e1f3d76845ffbcf47c8da3a07f5abbd90fbeb354837ba7ca481095d094\"" Jan 28 01:20:30.668209 containerd[1674]: time="2026-01-28T01:20:30.668188563Z" level=info msg="connecting to shim f8cdc9e1f3d76845ffbcf47c8da3a07f5abbd90fbeb354837ba7ca481095d094" address="unix:///run/containerd/s/76156d2b767e5dc73bde3a7de5908e721044acbed11f7f52a9455991f54d77d1" protocol=ttrpc version=3 Jan 28 01:20:30.686240 
systemd[1]: Started cri-containerd-d476089002164f426c37dab83f0277f7cd8175a460b83185cb64336a2e0f8f99.scope - libcontainer container d476089002164f426c37dab83f0277f7cd8175a460b83185cb64336a2e0f8f99. Jan 28 01:20:30.689748 systemd[1]: Started cri-containerd-f8cdc9e1f3d76845ffbcf47c8da3a07f5abbd90fbeb354837ba7ca481095d094.scope - libcontainer container f8cdc9e1f3d76845ffbcf47c8da3a07f5abbd90fbeb354837ba7ca481095d094. Jan 28 01:20:30.705559 kernel: audit: type=1334 audit(1769563230.701:883): prog-id=257 op=LOAD Jan 28 01:20:30.701000 audit: BPF prog-id=257 op=LOAD Jan 28 01:20:30.704000 audit: BPF prog-id=258 op=LOAD Jan 28 01:20:30.708023 kernel: audit: type=1334 audit(1769563230.704:884): prog-id=258 op=LOAD Jan 28 01:20:30.704000 audit: BPF prog-id=259 op=LOAD Jan 28 01:20:30.710193 kernel: audit: type=1334 audit(1769563230.704:885): prog-id=259 op=LOAD Jan 28 01:20:30.704000 audit[5533]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2594 pid=5533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.715438 kernel: audit: type=1300 audit(1769563230.704:885): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2594 pid=5533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434373630383930303231363466343236633337646162383366303237 Jan 28 01:20:30.704000 audit: BPF prog-id=259 op=UNLOAD Jan 28 01:20:30.704000 audit[5533]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 
a1=0 a2=0 a3=0 items=0 ppid=2594 pid=5533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434373630383930303231363466343236633337646162383366303237 Jan 28 01:20:30.704000 audit: BPF prog-id=260 op=LOAD Jan 28 01:20:30.704000 audit[5540]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3091 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638636463396531663364373638343566666263663437633864613361 Jan 28 01:20:30.704000 audit: BPF prog-id=260 op=UNLOAD Jan 28 01:20:30.704000 audit[5540]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638636463396531663364373638343566666263663437633864613361 Jan 28 01:20:30.704000 audit: BPF prog-id=261 op=LOAD Jan 28 01:20:30.704000 audit[5540]: SYSCALL arch=c000003e 
syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3091 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638636463396531663364373638343566666263663437633864613361 Jan 28 01:20:30.704000 audit: BPF prog-id=262 op=LOAD Jan 28 01:20:30.704000 audit[5540]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3091 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638636463396531663364373638343566666263663437633864613361 Jan 28 01:20:30.704000 audit: BPF prog-id=262 op=UNLOAD Jan 28 01:20:30.704000 audit[5540]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638636463396531663364373638343566666263663437633864613361 Jan 28 01:20:30.704000 audit: BPF prog-id=261 op=UNLOAD Jan 28 
01:20:30.704000 audit[5540]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3091 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638636463396531663364373638343566666263663437633864613361 Jan 28 01:20:30.704000 audit: BPF prog-id=263 op=LOAD Jan 28 01:20:30.704000 audit[5533]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2594 pid=5533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434373630383930303231363466343236633337646162383366303237 Jan 28 01:20:30.704000 audit: BPF prog-id=264 op=LOAD Jan 28 01:20:30.704000 audit[5533]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2594 pid=5533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434373630383930303231363466343236633337646162383366303237 Jan 28 
01:20:30.704000 audit: BPF prog-id=264 op=UNLOAD Jan 28 01:20:30.704000 audit[5533]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2594 pid=5533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434373630383930303231363466343236633337646162383366303237 Jan 28 01:20:30.704000 audit: BPF prog-id=263 op=UNLOAD Jan 28 01:20:30.704000 audit[5533]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2594 pid=5533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434373630383930303231363466343236633337646162383366303237 Jan 28 01:20:30.706000 audit: BPF prog-id=265 op=LOAD Jan 28 01:20:30.706000 audit[5533]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2594 pid=5533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.706000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434373630383930303231363466343236633337646162383366303237 Jan 28 01:20:30.704000 audit: BPF prog-id=266 op=LOAD Jan 28 01:20:30.704000 audit[5540]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3091 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:30.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638636463396531663364373638343566666263663437633864613361 Jan 28 01:20:30.740821 containerd[1674]: time="2026-01-28T01:20:30.740733479Z" level=info msg="StartContainer for \"f8cdc9e1f3d76845ffbcf47c8da3a07f5abbd90fbeb354837ba7ca481095d094\" returns successfully" Jan 28 01:20:30.762056 containerd[1674]: time="2026-01-28T01:20:30.761996994Z" level=info msg="StartContainer for \"d476089002164f426c37dab83f0277f7cd8175a460b83185cb64336a2e0f8f99\" returns successfully" Jan 28 01:20:32.692357 kubelet[2910]: I0128 01:20:32.692185 2910 status_manager.go:895] "Failed to get status for pod" podUID="1d9bc333489a6a6279cb4f704c064614" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-62761e1650" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.143:50244->10.0.0.202:2379: read: connection timed out" Jan 28 01:20:32.692357 kubelet[2910]: E0128 01:20:32.692157 2910 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.143:50152->10.0.0.202:2379: read: connection timed out" 
event="&Event{ObjectMeta:{kube-apiserver-ci-4593-0-0-n-62761e1650.188ec0609db31ead kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4593-0-0-n-62761e1650,UID:6b411eb2a0f2e7361dac8f8e13a6c7e4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4593-0-0-n-62761e1650,},FirstTimestamp:2026-01-28 01:20:24.798338733 +0000 UTC m=+312.154190191,LastTimestamp:2026-01-28 01:20:24.798338733 +0000 UTC m=+312.154190191,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-n-62761e1650,}" Jan 28 01:20:33.689103 containerd[1674]: time="2026-01-28T01:20:33.688938600Z" level=info msg="container event discarded" container=146504ff689e90043ef8055ee6f9700149e2c205715c626fe0f93773a4fca2a9 type=CONTAINER_CREATED_EVENT Jan 28 01:20:33.689103 containerd[1674]: time="2026-01-28T01:20:33.689036175Z" level=info msg="container event discarded" container=146504ff689e90043ef8055ee6f9700149e2c205715c626fe0f93773a4fca2a9 type=CONTAINER_STARTED_EVENT Jan 28 01:20:33.747937 kubelet[2910]: E0128 01:20:33.747844 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lkr4f" podUID="e60412d5-27c3-4569-9b64-5743c10cc437" Jan 28 01:20:33.773581 containerd[1674]: time="2026-01-28T01:20:33.773506592Z" level=info msg="container event discarded" container=f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3 type=CONTAINER_CREATED_EVENT Jan 28 01:20:33.773581 containerd[1674]: time="2026-01-28T01:20:33.773554519Z" level=info msg="container event discarded" container=f18da8e1cfe1ee3c5f541bc71c1fa4d5088e88c06dac6e43360be833102950e3 type=CONTAINER_STARTED_EVENT Jan 28 01:20:34.749154 kubelet[2910]: E0128 01:20:34.749076 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b767f6fff-w2bfk" podUID="3934179d-fb13-45a1-9643-cbd7ec08e773" Jan 28 01:20:35.885484 systemd[1]: cri-containerd-3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01.scope: Deactivated successfully. Jan 28 01:20:35.886377 systemd[1]: cri-containerd-3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01.scope: Consumed 2.675s CPU time, 23.3M memory peak, 232K read from disk. 
Jan 28 01:20:35.891407 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 28 01:20:35.891514 kernel: audit: type=1334 audit(1769563235.886:899): prog-id=267 op=LOAD Jan 28 01:20:35.886000 audit: BPF prog-id=267 op=LOAD Jan 28 01:20:35.895081 kernel: audit: type=1334 audit(1769563235.889:900): prog-id=93 op=UNLOAD Jan 28 01:20:35.889000 audit: BPF prog-id=93 op=UNLOAD Jan 28 01:20:35.895232 containerd[1674]: time="2026-01-28T01:20:35.893873669Z" level=info msg="received container exit event container_id:\"3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01\" id:\"3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01\" pid:2773 exit_status:1 exited_at:{seconds:1769563235 nanos:889885550}" Jan 28 01:20:35.894000 audit: BPF prog-id=108 op=UNLOAD Jan 28 01:20:35.898142 kernel: audit: type=1334 audit(1769563235.894:901): prog-id=108 op=UNLOAD Jan 28 01:20:35.898229 kernel: audit: type=1334 audit(1769563235.894:902): prog-id=112 op=UNLOAD Jan 28 01:20:35.894000 audit: BPF prog-id=112 op=UNLOAD Jan 28 01:20:35.926381 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01-rootfs.mount: Deactivated successfully. 
Jan 28 01:20:36.307996 containerd[1674]: time="2026-01-28T01:20:36.307820509Z" level=info msg="container event discarded" container=1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce type=CONTAINER_CREATED_EVENT Jan 28 01:20:36.386401 containerd[1674]: time="2026-01-28T01:20:36.386310323Z" level=info msg="container event discarded" container=1faf0acc4cebe3dfbb3d6732ec6d55be8771db0a54de517751ff2d8d30c96cce type=CONTAINER_STARTED_EVENT Jan 28 01:20:36.661906 kubelet[2910]: I0128 01:20:36.661084 2910 scope.go:117] "RemoveContainer" containerID="3d2963883752648312d4707aec27a87dbc756e5bf172de762e5ce01812457d01" Jan 28 01:20:36.665371 containerd[1674]: time="2026-01-28T01:20:36.665323774Z" level=info msg="CreateContainer within sandbox \"dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 28 01:20:36.678758 containerd[1674]: time="2026-01-28T01:20:36.678715474Z" level=info msg="Container 4c81026f3e2b95cb444363caf4e8c5dab3b1d4d66463b355ba75ec9896f3f9f8: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:20:36.686956 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount275212140.mount: Deactivated successfully. 
Jan 28 01:20:36.691213 containerd[1674]: time="2026-01-28T01:20:36.691170913Z" level=info msg="CreateContainer within sandbox \"dd34bf28c83eb44ad4e782281d36d86b686f604095d4b27294bf6684fec53017\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"4c81026f3e2b95cb444363caf4e8c5dab3b1d4d66463b355ba75ec9896f3f9f8\"" Jan 28 01:20:36.692024 containerd[1674]: time="2026-01-28T01:20:36.691979741Z" level=info msg="StartContainer for \"4c81026f3e2b95cb444363caf4e8c5dab3b1d4d66463b355ba75ec9896f3f9f8\"" Jan 28 01:20:36.693312 containerd[1674]: time="2026-01-28T01:20:36.693252221Z" level=info msg="connecting to shim 4c81026f3e2b95cb444363caf4e8c5dab3b1d4d66463b355ba75ec9896f3f9f8" address="unix:///run/containerd/s/ae9a98da75e7cec6b1b8d39db82ff38110df4869442ab243b21c3737e64414d5" protocol=ttrpc version=3 Jan 28 01:20:36.718225 systemd[1]: Started cri-containerd-4c81026f3e2b95cb444363caf4e8c5dab3b1d4d66463b355ba75ec9896f3f9f8.scope - libcontainer container 4c81026f3e2b95cb444363caf4e8c5dab3b1d4d66463b355ba75ec9896f3f9f8. 
Jan 28 01:20:36.732000 audit: BPF prog-id=268 op=LOAD Jan 28 01:20:36.735048 kernel: audit: type=1334 audit(1769563236.732:903): prog-id=268 op=LOAD Jan 28 01:20:36.734000 audit: BPF prog-id=269 op=LOAD Jan 28 01:20:36.737063 kernel: audit: type=1334 audit(1769563236.734:904): prog-id=269 op=LOAD Jan 28 01:20:36.734000 audit[5615]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000174238 a2=98 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:36.734000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463383130323666336532623935636234343433363363616634653863 Jan 28 01:20:36.743043 kernel: audit: type=1300 audit(1769563236.734:904): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000174238 a2=98 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:36.743116 kernel: audit: type=1327 audit(1769563236.734:904): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463383130323666336532623935636234343433363363616634653863 Jan 28 01:20:36.736000 audit: BPF prog-id=269 op=UNLOAD Jan 28 01:20:36.746464 kernel: audit: type=1334 audit(1769563236.736:905): prog-id=269 op=UNLOAD Jan 28 01:20:36.746518 kernel: audit: type=1300 audit(1769563236.736:905): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:36.736000 audit[5615]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:36.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463383130323666336532623935636234343433363363616634653863 Jan 28 01:20:36.736000 audit: BPF prog-id=270 op=LOAD Jan 28 01:20:36.736000 audit[5615]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000174488 a2=98 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:36.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463383130323666336532623935636234343433363363616634653863 Jan 28 01:20:36.736000 audit: BPF prog-id=271 op=LOAD Jan 28 01:20:36.736000 audit[5615]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000174218 a2=98 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:36.736000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463383130323666336532623935636234343433363363616634653863 Jan 28 01:20:36.736000 audit: BPF prog-id=271 op=UNLOAD Jan 28 01:20:36.736000 audit[5615]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:36.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463383130323666336532623935636234343433363363616634653863 Jan 28 01:20:36.736000 audit: BPF prog-id=270 op=UNLOAD Jan 28 01:20:36.736000 audit[5615]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:36.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463383130323666336532623935636234343433363363616634653863 Jan 28 01:20:36.736000 audit: BPF prog-id=272 op=LOAD Jan 28 01:20:36.736000 audit[5615]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001746e8 a2=98 a3=0 items=0 ppid=2641 pid=5615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:20:36.736000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463383130323666336532623935636234343433363363616634653863 Jan 28 01:20:36.790029 containerd[1674]: time="2026-01-28T01:20:36.789559445Z" level=info msg="StartContainer for \"4c81026f3e2b95cb444363caf4e8c5dab3b1d4d66463b355ba75ec9896f3f9f8\" returns successfully" Jan 28 01:20:38.279943 containerd[1674]: time="2026-01-28T01:20:38.279846700Z" level=info msg="container event discarded" container=b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c type=CONTAINER_CREATED_EVENT Jan 28 01:20:38.404266 containerd[1674]: time="2026-01-28T01:20:38.404176802Z" level=info msg="container event discarded" container=b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c type=CONTAINER_STARTED_EVENT Jan 28 01:20:38.518630 containerd[1674]: time="2026-01-28T01:20:38.518554698Z" level=info msg="container event discarded" container=b4b8e077e959069c5b614169d0a4147dfb7ed148a12d688842b10d961f3dbf2c type=CONTAINER_STOPPED_EVENT Jan 28 01:20:38.748781 kubelet[2910]: E0128 01:20:38.748720 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79f9cd9ddf-ggx6z" podUID="72726466-f235-4a31-a84a-a3699d8c85f7" Jan 28 01:20:39.747667 kubelet[2910]: E0128 01:20:39.747620 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-6dh2c" podUID="bead6395-8434-48df-aa67-e987782da70c" Jan 28 01:20:40.265214 kubelet[2910]: E0128 01:20:40.265088 2910 controller.go:195] "Failed to update lease" err="Put \"https://10.0.0.143:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-62761e1650?timeout=10s\": context deadline exceeded" Jan 28 01:20:40.749049 kubelet[2910]: E0128 01:20:40.748970 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cblpt" podUID="c22de3ae-0a27-443f-9dd3-c4ab0a4176bd" Jan 28 01:20:40.750664 kubelet[2910]: E0128 01:20:40.750612 2910 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76f584f9b9-9mjk9" podUID="7ea90b44-fc7d-4702-a1a5-1c558b3ecd80"